IIIT-Hyderabad Student Wins Indian Navy Innovation Award

Updated: 2024-12-12 16:35 GMT
Rishabh Bhattacharya secures Rs 3 lakh for his sub-pixel accuracy algorithm tackling navigation and tracking challenges at Swavalamban 2024. (Image by arrangement)

HYDERABAD: Rishabh Bhattacharya, a third-year student at IIIT-Hyderabad, has won first prize at an Indian Navy event for an algorithm that enhances navigation and real-time tracking of flying objects such as drones.

The prize, carrying a cash award of Rs 3 lakh, was announced at Swavalamban 2024, the Indian Navy’s innovation and indigenisation seminar, and recognises his optical flow tracking algorithm, which achieves sub-pixel accuracy.

His solution also demonstrated robustness to environmental challenges such as poor lighting, rapid movements and complex textures.

Held in October, the Swavalamban seminar hosted a nationwide competition inviting participants to tackle operational challenges using technological innovation. The competition presented problem statements ranging from swarm drone coordination and maritime situational awareness to navigation and tracking of flying objects.

Bhattacharya opted for the last of these, inspired by research he had presented at the IEEE International Conference on Robotics and Automation (ICRA) 2023.

"One of the criteria laid out was for the solution to demonstrate resilience to varying lighting conditions, rapid movements and complex textures while maintaining efficiency on platforms like drones or embedded systems," said Bhattacharya.

His algorithm was built for sub-pixel accuracy to ensure fine-grained motion estimation and tracking. He said achieving this level of precision was challenging due to the unpredictable movements of flying objects.

"Tracking flying objects introduces complexities due to their rapid and unpredictable movements, necessitating advanced detection and tracking mechanisms that can operate seamlessly in real-time," he explained.

To address the lack of a suitable dataset, Bhattacharya combined the flying objects dataset from Sekilab, which includes planes, helicopters and birds, with a UAV dataset available on Kaggle. He then created a synthetic dataset using semantic separation techniques to isolate and move individual objects across frames, simulating diverse motion scenarios.
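In practice, semantic separation of this kind amounts to masking a segmented object out of one frame and re-compositing it onto a background at shifted positions to synthesise motion. The minimal sketch below illustrates the idea; the function name, mask source and displacement schedule are illustrative assumptions rather than details of his pipeline.

```python
import cv2

def synthesize_motion(background, obj_img, obj_mask, n_frames=30,
                      start=(50, 50), velocity=(12, 4)):
    """Paste a segmented object onto a background at shifted positions
    to simulate motion across frames. All parameters are illustrative."""
    frames = []
    h, w = obj_mask.shape
    for t in range(n_frames):
        frame = background.copy()
        x = int(start[0] + velocity[0] * t)
        y = int(start[1] + velocity[1] * t)
        # Stop once the pasted patch would leave the frame.
        if x + w > frame.shape[1] or y + h > frame.shape[0]:
            break
        roi = frame[y:y + h, x:x + w]
        # Composite: keep background pixels where the mask is 0,
        # object pixels where the mask is 1.
        mask3 = cv2.merge([obj_mask] * 3).astype(bool)
        roi[mask3] = obj_img[mask3]
        frames.append(frame)
    return frames
```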

"The combined dataset, totalling 7.7 gigabytes, is slated for public release to benefit the broader research community," Bhattacharya informed.

To improve the algorithm's performance under challenging environmental conditions, Bhattacharya integrated the GDIP framework he had developed earlier. GDIP, which stands for Gated Differentiable Image Processing, enhances object detection models such as YOLOv8 by making them more effective in fog and low lighting.
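The gating idea behind GDIP is to let a small network weight several differentiable image-processing operations and blend their outputs before the image reaches the detector. The PyTorch sketch below is a conceptual illustration under that assumption, not the published GDIP architecture or Bhattacharya's code.

```python
import torch
import torch.nn as nn

class GatedPreprocess(nn.Module):
    """Conceptual sketch of a gated preprocessing block: a tiny network
    predicts blend weights over a few differentiable image operations.
    The specific ops and gate network here are illustrative only."""
    def __init__(self):
        super().__init__()
        # Tiny encoder that predicts one weight per operation.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(8), nn.Flatten(),
            nn.Linear(3 * 8 * 8, 32), nn.ReLU(),
            nn.Linear(32, 3), nn.Softmax(dim=-1))

    def forward(self, x):                 # x: (B, 3, H, W) in [0, 1]
        w = self.gate(x)                  # (B, 3) blend weights
        mean = x.mean(dim=(2, 3), keepdim=True)
        ops = torch.stack([
            x,                            # identity
            x.clamp(1e-6, 1.0) ** 0.7,    # gamma lift for low light
            (x - mean) * 1.5 + mean,      # contrast stretch
        ], dim=1)                         # (B, 3, 3, H, W)
        return (w[:, :, None, None, None] * ops).sum(dim=1)

# The enhanced image would then be fed to a detector such as YOLOv8.
```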

The model was trained using the combined dataset over 50 epochs and optimised for real-time application, processing each frame in approximately two milliseconds. Testing demonstrated its reliability under varying lighting, complex textures and unpredictable movements.
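A budget of roughly two milliseconds per frame is typically verified by timing the inference loop directly. The short sketch below shows one way such a check might look; the `model` and `frames` names are placeholders, not part of the reported system.

```python
import time

def benchmark(model, frames, warmup=10):
    """Rough per-frame latency check; `model` and `frames` are placeholders
    for a trained tracker and a list of preloaded video frames."""
    for f in frames[:warmup]:          # warm-up runs excluded from timing
        model(f)
    start = time.perf_counter()
    for f in frames[warmup:]:
        model(f)
    elapsed = time.perf_counter() - start
    return 1000.0 * elapsed / max(1, len(frames) - warmup)  # ms per frame

# Example: print(f"{benchmark(tracker, frames):.2f} ms/frame")
```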

"The various things that I worked on in the Machine Learning Lab under the guidance of Dr Naresh Manwani gave me exposure to different ideas, some of which I used in the hackathon," Bhattacharya said.

He recalled discussions on a research paper during a lab project that later informed his final solution. The seminar brought Bhattacharya face-to-face with Navy admirals and commanders, who expressed interest in integrating his solution into operational frameworks. "Meeting Navy officials who appreciated and discussed my work was an inspiring moment," he stated.


