Applus+ IDIADA proving ground goes digital

27/06/2018
    ‘Digital Twin’ of Applus+ IDIADA proving ground paves the way for real-world trials of autonomous vehicles
    Hertfordshire, UK, 26 June 2018... UK software specialist, rFpro, is developing a highly accurate virtual model of Applus+ IDIADA’s proving ground to be used for the development of vehicles in simulation. The Digital Twin of the proving ground enables vehicle manufacturers to accelerate the development of ADAS and CAVs (Connected Autonomous Vehicles) by testing them in a fully representative virtual environment before validation on the track.
     
    “Using a virtual environment is the only cost-effective way to subject these self-learning systems to the limitless number of scenarios that can occur in the real world,” says Chris Hoyle, Technical Director, rFpro. “Identical scenarios can be choreographed at the proving ground to validate the simulation results, allowing customers to confidently progress to real-world trials. Our virtual model is a vital part of the road map for the development of CAVs at Applus+ IDIADA within the regulatory framework.”
     
    rFpro’s Applus+ IDIADA Digital Twin will be the latest addition to its library of digital models, the world’s largest, which includes proving grounds, test tracks and thousands of kilometres of varying real roads. rFpro is the industry’s most open simulation software package and can be used with a wide range of vehicle models and driving simulation platforms.
     
    “The Applus+ IDIADA facility already provides a safe environment for the controlled testing of autonomous functionality and a natural, ‘real world’, extension to customers’ software engineering processes,” says Javier Gutierrez, Applus+ IDIADA Project Manager, Chassis Development Vehicle Dynamics. “By investing in a digital model, we can also become an integrated part of our customers’ continuous software development tool-chain, significantly reducing the development and validation time, and therefore the cost, of autonomous systems.”
     
    The digital models created by rFpro can be populated by ego vehicles (the customer’s vehicles) as well as by semi-intelligent Swarm traffic and Programmed traffic. Vehicles and pedestrians share the road network correctly, with perfectly synchronised traffic and pedestrian signals and full observance of the rules of the road, while ad-hoc behaviour, such as a pedestrian stepping into the road, can be scripted to provoke an emergency. This allows digital experiments to precisely mirror the physical tests conducted on the proving ground with robot soft targets, as sketched below.
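
    rFpro’s own scenario interface is not described in this release, so the following is a purely hypothetical sketch of the idea it describes: combining an ego vehicle, background Swarm traffic and a Programmed pedestrian step-out into one repeatable experiment. All class and field names below are invented for illustration only.

```python
# Illustrative only: a hypothetical scenario description combining an ego
# vehicle, semi-intelligent Swarm traffic and a scripted ("Programmed")
# pedestrian step-out. None of these names come from the rFpro API.
from dataclasses import dataclass, field


@dataclass
class PedestrianStepOut:
    """A scripted ad-hoc event used to provoke an emergency response."""
    trigger_distance_m: float   # distance from the ego vehicle at which the pedestrian steps out
    walking_speed_mps: float    # pedestrian speed while crossing


@dataclass
class Scenario:
    ego_vehicle: str                          # the customer's vehicle-under-test
    swarm_vehicle_count: int                  # semi-intelligent background traffic
    obey_traffic_signals: bool = True         # traffic and pedestrian signals stay synchronised
    programmed_events: list = field(default_factory=list)


# One repeatable experiment; the same choreography could later be reproduced
# on the physical proving ground with robot soft targets.
scenario = Scenario(
    ego_vehicle="customer_sedan",
    swarm_vehicle_count=40,
    programmed_events=[PedestrianStepOut(trigger_distance_m=25.0, walking_speed_mps=1.4)],
)
print(scenario)
```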
     
    Early adopters of virtual-environment testing for autonomous vehicles are already carrying out more than 2 million miles of testing per month, but to be effective as a development tool the simulation must correlate accurately with the real world. rFpro is an industry leader in this respect: by using phase-based laser-scanning survey data it can create models with an accuracy of around 1 mm in Z (height) and in X and Y (position).
     
    A key element of rFpro’s software is its TerrainServer surface model, which enables a high-definition road surface to be simulated. By capturing detailed surface information that is missed by point-based sampling methods, TerrainServer allows very high correlation with the actual road surfaces used during ‘real world’ testing. This extends the use of the digital model into vehicle dynamics applications, allowing ride and secondary-ride experiments to be conducted with real-time models on driving simulators.
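
    The release does not describe how TerrainServer stores or serves its surface data, so the following is only a generic sketch of why a high-definition surface matters for ride work: short-wavelength road texture falls between the points of a coarse, point-based survey and is lost, whereas a dense surface model retains it. The road profile and sampling intervals below are invented for illustration.

```python
# Generic illustration (not the TerrainServer format): a synthetic road profile
# containing a long undulation plus fine 1 mm texture is sampled once per metre
# (a point-based survey) and compared with the dense 1 mm-resolution truth.
import numpy as np

x = np.linspace(0.0, 10.0, 10_001)                  # 10 m of road at 1 mm spacing
road = 0.02 * np.sin(2 * np.pi * 0.05 * x) \
     + 0.001 * np.sin(2 * np.pi * 15.0 * x)         # long undulation + fine texture

# "Point-based" survey: one height sample per metre, then linearly interpolated.
x_coarse = x[::1000]
z_coarse = road[::1000]
reconstructed = np.interp(x, x_coarse, z_coarse)

# The reconstruction error is the surface detail the coarse survey cannot see;
# it is roughly the RMS of the fine texture itself, i.e. the content that
# excites ride and secondary ride.
error_rms = np.sqrt(np.mean((road - reconstructed) ** 2))
print(f"RMS surface detail lost to coarse sampling: {error_rms * 1000:.2f} mm")
```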
     
    Unlike other virtual models, rFpro uses exceptionally high-quality, realistic rendering, up to HDR32, which is essential for the training, testing and validation of deep-learning-based ADAS and autonomous systems, as Hoyle explains. “The resolution and dynamic range of camera sensors are increasing every year, so it is important to be able to render high-resolution HDR32 video in real time. Our system is also unique in avoiding the patterns and video artefacts that arise in synthetic simulation tools, which would otherwise impair deep-learning training performance.”
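
    The numbers below are invented, but they illustrate in a generic way (not rFpro’s renderer) why a 32-bit floating-point HDR pipeline matters for camera-sensor work: a night scene containing oncoming headlights and a dimly lit pedestrian spans several orders of magnitude of radiance, which an 8-bit buffer cannot represent without clipping highlights or crushing shadows.

```python
# Rough illustration of dynamic range: hypothetical relative scene radiances
# stored as 32-bit floats versus quantised into an 8-bit buffer.
import numpy as np

radiance = np.array([
    50_000.0,   # oncoming headlight
    120.0,      # well-lit road surface
    0.4,        # pedestrian on an unlit verge
], dtype=np.float32)

# 8-bit LDR storage: normalise to the brightest value, quantise to 256 levels.
ldr = np.round(radiance / radiance.max() * 255).astype(np.uint8)

print("float32 HDR buffer:", radiance)   # all three values survive intact
print("8-bit LDR buffer:  ", ldr)        # the road collapses to one code value; the pedestrian vanishes
```
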
    Lighting is modelled accurately for Applus+ IDIADA’s latitude and longitude, day of the year, time of day, and atmospheric and weather conditions. This includes circumstances such as the transition between poorly lit and well-lit roads, the effect of the sun low in the sky or the approaching headlights of oncoming traffic, all of which can be particularly challenging for ADAS and autonomous vehicle sensors.
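
    As a rough illustration of the kind of sun-position calculation implied here (not rFpro’s lighting model), solar elevation can be approximated from latitude, day of year and local solar time using the standard declination and hour-angle formulae; the coordinates below are only an approximation of the Applus+ IDIADA site.

```python
# Back-of-the-envelope solar elevation from latitude, day of year and local
# solar time, using the common declination/hour-angle approximation.
import math

def solar_elevation_deg(latitude_deg: float, day_of_year: int, solar_hour: float) -> float:
    """Approximate solar elevation angle in degrees (simple declination model)."""
    declination = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)          # degrees; zero at solar noon
    lat, dec, ha = map(math.radians, (latitude_deg, declination, hour_angle))
    sin_elev = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(ha)
    return math.degrees(math.asin(sin_elev))

lat_idiada = 41.2   # approximate latitude of the proving ground near Tarragona, Spain
# Low morning sun versus near-maximum elevation at solar noon on 26 June (day 177).
print(solar_elevation_deg(lat_idiada, day_of_year=177, solar_hour=6.0))
print(solar_elevation_deg(lat_idiada, day_of_year=177, solar_hour=12.0))
```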
