About

I am a PhD student in the Computation and Neural Systems (CNS) program at Caltech. I work in the lab of Professor Joe Parker on monitoring ants in the Angeles National Forest using custom in-house field cameras and computer vision. Prior to this, I worked with Professor Michael Dickinson on the flight and gaze stabilization systems of the fruit fly. My research interests include conservation technology, computer vision, and animal behavior. Before Caltech, I completed my undergraduate degree in Computer Science and Engineering at PES Institute of Technology, Bangalore, and worked for a year as a research assistant with Professor Thomas Serre at Brown University.

Ongoing projects

Haltere mediated equilibrium reflex

One of the key requirements for successful flight is the ability to rapidly correct for external perturbations. The halteres are structures unique to flies that are known to play a critical role in this process; they are hypothesized to detect Coriolis forces via multiple fields of strain-sensing cells. Using genetic tools, I am currently working on understanding the role of these fields in wing and head stabilization in fruit flies.

Monitoring of ants in the wild

Ants play a huge role in shaping the ecosystems around us. Here in Southern California, the velvety tree ant is a keystone species through its numerous inter-species interactions and its influence on soil. However, very little is known about this ecologically dominant ant in the wild. In this ongoing project, in collaboration with Sara Beery, Julian Wagner and Joe Parker, we use a custom-built camera trap (made by Will Dickson) positioned at the entrance of an ant nest, along with state-of-the-art computer vision methods, to detect, track and monitor activity with minimal human labeling. We tracked ant activity across multiple days and studied the influence of environmental factors such as temperature, humidity and time of day on activity levels. We are in the process of expanding monitoring to multiple ant nests and to nighttime conditions.
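As a flavor of the downstream analysis, here is a minimal sketch of relating per-interval detection counts to an environmental covariate via a Pearson correlation. The function name and the choice of a simple correlation are illustrative only, not the project's actual pipeline.

```python
import numpy as np

def activity_correlation(counts, covariate):
    """Pearson correlation between per-interval ant detection counts
    and an aligned environmental covariate (e.g. temperature).

    Both inputs are 1D sequences over the same time intervals.
    """
    counts = np.asarray(counts, dtype=float)
    covariate = np.asarray(covariate, dtype=float)
    return float(np.corrcoef(counts, covariate)[0, 1])
```

In practice each count would come from running the detector over one time window of camera-trap footage, with the covariate logged by co-located sensors.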

Bioacoustics analysis of bumblebees

Bumblebees are very important pollinators for a wide range of flowering plants and agricultural crops. Some species of bumblebees are facing rapid population declines, which could have far-ranging ecological consequences. Bioacoustic data can provide a wealth of information and serve as a cheap monitoring tool. In this ongoing project, in collaboration with Alixandra Prybyla from the Edinburgh Bumblebee Brigade, we are working on applying machine learning tools to high-quality recordings of bumblebees in the wild, in order to gain insights into species distribution.

Publications

Making Use of Unlabeled Data: Comparing Strategies for Marine Animal Detection in Long-tailed Datasets Using Self-supervised and Semi-supervised Pre-training

Tarun Sharma, Danelle E. Cline, Duane Edgington
CVPR Workshop Proceedings 2024
[paper]

Monitoring Social Insect Activity with Minimal Human Supervision

Tarun Sharma, Julian M. Wagner, Sara Beery, William B. Dickson, Michael H. Dickinson, Joseph Parker
CVPR Workshop Proceedings 2024
[paper]

Teaching Computer Vision for Ecology

Elijah Cole, Suzanne Stathatos, Björn Lütjens, Tarun Sharma, Justin Kay, Jason Parham, Benjamin Kellenberger, Sara Beery
arXiv
[paper]

Using computational analysis of behavior to discover developmental change in memory-guided attention mechanisms in childhood

Dima Amso, Lakshmi Govindarajan, Pankaj Gupta, Diego Placido, Heidi Baumgartner, Andrew Lynn, Kelley Gunther, Tarun Sharma, Vijay Veerabadran, Kalpit Thakkar, Seung Chan Kim, Thomas Serre
PsyArXiv 2021
[paper]

Diverse Food-Sensing Neurons Trigger Idiothetic Local Search in Drosophila

Roman Corfas, Tarun Sharma, Michael Dickinson
Current Biology 2019
[paper]

Thematic segmentation of long content using deep learning and contextual cues

Jayananda Appanna Kotri, Tarun Sharma, Sharad Kejriwal, Yashwanth Dasari, Abinaya S
US Patent with SAP SE
[patent]

Neural Computing on a Raspberry Pi: Applications to Zebrafish Behavior Monitoring

Lakshmi Narasimhan Govindarajan, Tarun Sharma, Ruth Colwill, Thomas Serre
VAIB 2018
[paper]

What are the visual features underlying human versus machine vision?

D. Linsley, S. Eberhardt, T. Sharma, P. Gupta, T. Serre
ICCV 2017
[paper]

Learning to predict action potentials end-to-end from calcium imaging data

Drew Linsley, Jeremy W. Linsley, Tarun Sharma, Nathan Myers, Thomas Serre
IEEE CISS 2018
[paper]

NAVI: Navigation aid for the visually impaired

Tarun Sharma, J.H.M Apoorva, Ramananathan Lakshmanan, Prakruti Gogia, Manoj Kondapaka
IEEE ICCCA 2016
[paper]

Towards Quantifying the Amount of Uncollected Garbage through Image Analysis

Susheel Suresh, Tarun Sharma, Prashanth T.K., Subramaniam V, Dinkar Sitaram, Nirupama M
ICVGIP 2016
[paper]

Other projects

Unsupervised behavior detection in the praying mantis

Project for the Chen Innovator grant. Collected videos of praying mantids in the lab, tracked their appendages using DeepLabCut, and worked on unsupervised detection of behavior using spectrogram analysis, wavelet transforms, clustering and the watershed algorithm. Compared the behavior of adult vs. juvenile mantids. This project was in collaboration with Annie Erickson.
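The core idea of the pipeline above can be sketched in miniature: compute short-time spectral features of a tracked joint-angle trace, then cluster windows into putative behaviors. This is a toy reduction (a NumPy-only STFT plus a tiny k-means with deterministic initialization), not the actual DeepLabCut/wavelet/watershed pipeline; all names and parameters are illustrative.

```python
import numpy as np

def stft_features(trace, win=64, hop=32):
    """Short-time Fourier magnitudes of a 1D joint-angle trace.
    Returns an array of shape (n_windows, win // 2 + 1)."""
    frames = [trace[i:i + win] * np.hanning(win)
              for i in range(0, len(trace) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1))

def kmeans(X, k, iters=50):
    """Tiny k-means with deterministic, spread-out initialization;
    returns one cluster label per row of X."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels
```

On a trace that switches between two oscillation frequencies, the two regimes land in different clusters, which is the basic signal that fancier embeddings (wavelets, t-SNE, watershed segmentation) then refine.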
[code]

Fly headtracker using 3D model

I used Mesolens scans of adult Drosophila by Professor Gail McConnell, cropping and downsampling the head so that it could be loaded into rendering software (Blender). I then generated a dataset of images rendered from above, with the head at various orientations. The aim of this project was to see whether it is possible to recover 3D head information from a 2D view of a fly head. Using the rendered images and their corresponding head coordinates, I compared two approaches: one in which a neural network predicts the change in head position from optic flow maps computed between the Nth frame and the first frame (assuming the first frame is at 0,0,0), and another in which a neural network takes consecutive frames and predicts the change in head position. Both approaches showed promise on the rendered dataset but did not generalize to real-world conditions because of the lack of 3D ground truth.
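A drastically simplified sketch of the second approach, with the neural network replaced by a linear least-squares regressor from flattened frame differences to pose deltas. Shapes, names, and the linear model are illustrative assumptions, not the trained network.

```python
import numpy as np

def fit_pose_regressor(frame_pairs, dpose):
    """Least-squares map from flattened frame differences to pose deltas.

    frame_pairs: array (n, 2, h, w) of consecutive rendered frames
    dpose:       array (n, 3) of pose changes (e.g. yaw, pitch, roll)
    Returns a weight matrix W such that dpose ~= diff @ W.
    """
    diff = (frame_pairs[:, 1] - frame_pairs[:, 0]).reshape(len(frame_pairs), -1)
    W, *_ = np.linalg.lstsq(diff, dpose, rcond=None)
    return W

def predict_dpose(W, frame_a, frame_b):
    """Predict the pose change between two consecutive frames."""
    return (frame_b - frame_a).ravel() @ W
```

The real version swaps the linear map for a convolutional network, but the supervision signal is the same: known pose deltas from the renderer, which is exactly what was missing for real-world footage.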

Analysis of deep neural networks for 3D scene representation

This project was done in collaboration with Erin Koch, Benyamin Haghi, Michaelangelo Caporale and Guruprasad Raghavan. The aim was to gain insights into how neurons in the brain might encode object properties, by examining the representations learned by 3D scene-encoding deep neural networks. We used the GQN model from DeepMind, which takes in multiple views of a scene and renders the scene from a query viewpoint. We applied a range of techniques to the encoded scene representations, such as dimensionality reduction, tuning curves, perturbations of individual units and correlations between units, to identify encoded properties. An arXiv manuscript is in preparation.
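As one example of the analyses applied to the encoded representations, here is a minimal tuning-curve sketch: the mean activation of each unit, binned by a scalar scene property. The function and variable names are illustrative, not from the project code, and bins are assumed non-empty.

```python
import numpy as np

def tuning_curves(reps, prop, bins=8):
    """Mean unit activation per bin of a scalar scene property.

    reps: (n_scenes, n_units) encoded scene representations
    prop: (n_scenes,) property value per scene (e.g. object hue)
    Returns a (bins, n_units) array of binned mean activations.
    """
    reps = np.asarray(reps, dtype=float)
    prop = np.asarray(prop, dtype=float)
    edges = np.linspace(prop.min(), prop.max(), bins + 1)
    idx = np.clip(np.digitize(prop, edges) - 1, 0, bins - 1)
    return np.array([reps[idx == b].mean(0) for b in range(bins)])
```

A unit whose curve rises or falls monotonically with the property is a candidate encoder of that property, which can then be probed further with single-unit perturbations.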

Real time video content based contextual advertisement generator

Developed a web-based framework that displays relevant ads to users in real time, based on the content of the video the user is watching. Used the Google Speech-to-Text API and the Alchemy API to parse the transcript and retrieve brand names, then used a web scraper to fetch and display relevant ads.
[code][paper]

Hands free cake book

Published a Windows Store desktop app that uses hand gesture recognition to let the user navigate the steps of baking a cake without touching the computer. The app has over 30,000 downloads.
[App]

Teaching

Computer Vision for Ecology Summer School

Teaching Assistant at the Computer Vision for Ecology Summer School 2022 (https://cv4ecology.caltech.edu/). This intensive, three-week program was aimed at teaching applied computer vision methods to senior ecology graduate students and postdocs. Students built hands-on computer vision systems to help answer their own ecological research questions, using their own data. As part of the core instructor team, I worked closely with multiple students, providing daily mentorship and hands-on assistance on their projects. I also gave lectures on data annotation tools and on unsupervised and self-supervised learning.

BE/Bi 106: Comparative Biomechanics

Teaching Assistant for the 2021 offering of this 9-unit course on Comparative Biomechanics at Caltech. The course focused on how engineering principles of solid and fluid mechanics can be applied to the study of biological systems, drawing on a wide array of biological phenomena from plants and animals. The course was taught by Michael Dickinson.

Presentations

  • "Machine Learning for Conservation". Invited talk to students of the HumaniTech class at Georgia Tech on the applications, benefits, and challenges of using machine learning tools to tackle environmental problems.
  • "Monitoring Social Insect Activity with Minimal Human Supervision". Oral presentation at Doctoral Consortium on Computational Sustainability (CompSust) 2022, with Julian Wagner on our research on monitoring ants in the wild using computer vision.
  • "Naturalistic behavior repertoires of the praying mantis". Oral presentation at the Chen Graduate Innovator Symposium, with Annie Erickson, on our work on unsupervised detection of behavior in praying mantids.
  • "Towards Quantifying the Amount of Uncollected Garbage through Image Analysis". Poster presentation at the Tenth Indian Conference on Computer Vision, Graphics and Image Processing in 2016.

CV

Download my full CV here.