
Purdue University Researchers Revolutionize Robotics and Autonomy with HADAR Technology


Advancing Robotics and Autonomy with HADAR: Purdue University’s Breakthrough

Researchers at Purdue University have developed an innovative method called HADAR (heat-assisted detection and ranging) that revolutionizes the field of robotics and autonomy. Led by Zubin Jacob, the Elmore Associate Professor of Electrical and Computer Engineering, and research scientist Fanglin Bao, this patent-pending technology improves upon traditional machine vision and perception. Their groundbreaking research on HADAR was featured in the renowned peer-reviewed journal Nature.

The Significance of HADAR

The future is expected to bring major advances in automation: by 2030, one in ten vehicles is projected to be automated, and an estimated 20 million robot helpers will assist people. To function without human intervention, these automated systems need to gather information about their surroundings. However, simultaneous perception of a scene by numerous agents is fundamentally challenging.

Traditional active sensors like LiDAR, radar, and sonar emit signals to collect 3D data about a scene. While these methods are effective, they have drawbacks, such as signal interference between agents and risks to eye safety. Video cameras that rely on sunlight or other sources of illumination work well in some scenarios, but nighttime, fog, and rain pose significant obstacles.

Traditional thermal imaging, which collects the invisible heat radiation emitted by all objects in a scene, is a fully passive sensing method. It can sense through darkness, inclement weather, and solar glare. However, because objects and their environment both emit and scatter thermal radiation, it suffers from the “ghosting effect”: the resulting images are textureless, lacking the features machines need to perceive a scene from heat radiation alone.

HADAR: Paving the Way for Fully Passive Machine Perception

HADAR combines thermal physics, infrared imaging, and machine learning to overcome the limitations of traditional thermal imaging. It enables fully passive and physics-aware machine perception.

Zubin Jacob explains, “Our work establishes the information theoretic foundations of thermal perception, demonstrating that pitch darkness provides the same amount of information as broad daylight. Machine perception of the future will eliminate the traditional distinction between day and night.”

HADAR vividly recovers texture from cluttered heat signals and accurately disentangles temperature, emissivity, and texture (TeX) of objects in a scene. It can see texture and depth even in pitch darkness, surpassing the capabilities of RGB imaging and conventional thermal sensing.
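To give a feel for the disentanglement idea, the sketch below uses a simplified thermal-signal model that is often used to explain TeX vision: an observed heat signal S is modeled as emitted blackbody radiation weighted by emissivity plus scattered environment radiation, S = e·B(T) + (1 − e)·X, where X carries the scene “texture.” This is a minimal illustration under that assumed model, not the HADAR algorithm itself; the function names and the single-wavelength simplification are ours.

```python
# Toy illustration of disentangling temperature, emissivity, and texture (TeX).
# Assumed signal model (a common simplification, not the paper's full method):
#     S = e * B(T) + (1 - e) * X
# S: observed heat signal, e: emissivity, B(T): blackbody radiance at
# temperature T, X: scattered environment radiation ("texture").

import math

# Physical constants (SI units)
H = 6.626e-34   # Planck constant
C = 3.0e8       # speed of light
KB = 1.381e-23  # Boltzmann constant

def planck(T, wavelength=10e-6):
    """Blackbody spectral radiance B(T) at a single LWIR wavelength (~10 um)."""
    a = 2 * H * C**2 / wavelength**5
    b = H * C / (wavelength * KB * T)
    return a / math.expm1(b)

def forward(T, e, X, wavelength=10e-6):
    """Observed signal under the toy model S = e*B(T) + (1-e)*X."""
    return e * planck(T, wavelength) + (1 - e) * X

def recover_texture(S, T, e, wavelength=10e-6):
    """Given S and known (T, e), disentangle the texture term X."""
    return (S - e * planck(T, wavelength)) / (1 - e)

# A 300 K object with emissivity 0.9 in front of environment radiation X
X_true = 2.0e6
S = forward(300.0, 0.9, X_true)
X_rec = recover_texture(S, 300.0, 0.9)
print(abs(X_rec - X_true) / X_true)  # small relative error: texture recovered
```

In the real problem, T, e, and X are all unknown per pixel, which is why HADAR needs multi-spectral measurements and machine learning to separate them; the sketch only shows why the signal mixes emission and scattered texture in the first place.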

During testing, HADAR’s TeX vision successfully recovered fine textures such as water ripples, bark wrinkles, and culverts, overcoming the ghosting effect typical of thermal imaging. The technology holds immense promise for automated vehicles and for robots that interact with humans in complex environments.

The Future of HADAR

HADAR TeX vision has promising applications in various sectors, including agriculture, defense, geosciences, healthcare, and wildlife monitoring. To further develop HADAR, the research team is working on reducing the size of the hardware and increasing its data-collection speed. The current sensors are bulky and slow, taking around one second to create an image, whereas autonomous cars require frame rates of 30 to 60 hertz.

Jacob and Bao have disclosed HADAR TeX to the Purdue Innovates Office of Technology Commercialization, which has applied for a patent on this pioneering intellectual property. Interested industry partners should reach out to Dipak Narula.

