A new data compression framework for single-photon lidar offers step-change efficiency gains in data processing and on-chip memory requirements without any significant loss of depth information. This innovative approach overcomes restrictive data transfer and computation bottlenecks to enable advances in lidar time-of-flight (ToF) imaging systems.
Single-photon lidar is rapidly becoming a key enabler for emerging applications requiring depth imaging information, including autonomous driving, advanced robotics, defence systems, smart retail and home solutions, and industrial automation. These use cases are driving larger and faster detector systems, which in turn generate ever-expanding volumes of data at increasing frame rates. Conventional ToF lidar systems generate a histogram of photon arrival times on-chip; however, this now represents a critical data transfer and computational processing challenge. Alleviating these bottlenecks requires an unwanted trade-off: sacrificing potentially important depth resolution to reduce the amount of information to be transferred and processed. Exploiting modern high-rate, high-resolution ToF image sensors therefore demands a new paradigm for processing imaging information.
To overcome this challenge, researchers at the University of Edinburgh have developed a new technique that allows massive compression of single-photon lidar data without any significant loss of information. Fundamentally, the Edinburgh approach completely bypasses the on-chip histogram generation that gives rise to the data and computational bottlenecks. The technique builds on recent advances in compressive learning: it samples the characteristic function of the ToF model to build a compressive statistic ("sketch") of the time-delay distribution, from which the distance and intensity of the object can be determined. The sketch is updated with each photon arrival at minimal computational overhead, eliminating the need to construct a histogram. Unlike conventional frameworks, the complexity scales independently of both photon count and depth resolution, and the method can be made effectively blind to photons originating from background sources. The technology can directly replace histogram generation in existing sensor hardware and has been tested on real-world datasets of complex scenes, where compression rates of 1/150 have been demonstrated without sacrificing the overall resolution of the reconstructed image.
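The core idea can be illustrated in a few lines. The sketch below is a simplified, illustrative version of the approach (not the authors' implementation): each pixel keeps only m complex numbers sampling the empirical characteristic function of the photon time-delay distribution, each new photon timestamp updates them in O(m), and the time delay is then recovered from the phase of the first sketch entry. The function names, the repetition period `T`, and the single-peak phase-based estimator are illustrative assumptions; the published framework handles background photons and multiple sketch frequencies more carefully.

```python
import numpy as np

def update_sketch(sketch, t, T, freqs):
    """Fold one photon timestamp t into the complex sketch.

    sketch : complex array of length m (samples of the empirical
             characteristic function at frequencies `freqs`)
    t      : photon time-of-arrival within the repetition period T
    freqs  : integer frequencies k = 1..m (illustrative choice)
    """
    sketch += np.exp(2j * np.pi * freqs * t / T)
    return sketch

def estimate_delay(sketch, T):
    """Recover the time delay from the phase of the first sketch entry.

    Valid for a single, low-background return: the phase of the k=1
    sample encodes the mean delay modulo the period T.
    """
    return (np.angle(sketch[0]) % (2 * np.pi)) * T / (2 * np.pi)

# Illustrative use: photons from a surface at ~30 ns delay, 100 ns period.
rng = np.random.default_rng(0)
T, m = 100.0, 4                      # period (ns) and sketch size
freqs = np.arange(1, m + 1)
sketch = np.zeros(m, dtype=complex)
for t in rng.normal(30.0, 0.5, size=1000):   # 1000 timestamped photons
    sketch = update_sketch(sketch, t, T, freqs)
depth_delay = estimate_delay(sketch, T)       # close to 30 ns
```

Note the memory contrast: the pixel stores m = 4 complex values instead of a full timing histogram (often thousands of bins), which is the source of the compression, and each photon triggers only a constant-cost update rather than a bin increment at full timing resolution.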
Michael P. Sheehan, Julián Tachella, and Mike E. Davies, “A Sketching Framework for Reduced Data Transfer in Photon Counting Lidar”, https://arxiv.org/pdf/2102.08732.pdf
Please note, the header image is purely illustrative.