Another Wavelength: Joseph Cox

    Date Posted: 
    Tuesday, December 1, 2020

    This month in Another Wavelength, we chat with fourth-year Ph.D. student Joseph Cox. Joe is currently mentored by Associate Professor Amit Ashok.

    Where are you from?

    Ridgecrest, California. It’s a small town in the Mojave Desert.

    What brought you to study optics?

    I started out as an electrical engineer, working for the Navy on radar imaging. I worked in electronic warfare, a field focused on jamming radars and stopping them from detecting targets. I realized that these techniques were getting really good, possibly making radar obsolete. I concluded that optics is the future of detecting targets, and that I should learn how to do optics. So, I applied to Optical Sciences, was accepted, and started the program. It turned out to be a great idea, because I’m now solving problems that I came here to learn how to solve.

    Who is your hero in science?

    My undergraduate advisor, Scott Teare. He convinced me that I should get a Ph.D. and introduced me to optics and imaging. He also introduced me to applied research, which I view as research aimed at understanding how a technology can enable or improve specific applications. This contrasts with basic research, where I view the goal as developing knowledge without demonstrating improvement in specific applications. While university research is mostly basic research, my project is mostly applied, and discussions with him convinced me that I could pursue this work for my dissertation.

    Describe your research in 20 words or fewer.

    I develop and evaluate imaging systems for target detection applications. I integrate new sensor technology, which leads to interesting questions.

    Describe your research in 200 words or fewer.

    I investigate a new type of imaging sensor, the Event-Based Sensor (EBS). An EBS is a camera with sophisticated electronics on the focal plane that output only when and where changes in irradiance are detected in the image. Because the EBS detects only changes, it can reduce a camera’s read-out bandwidth. For instance, imaging moving objects with a stationary EBS outputs data representing only the moving objects’ images, plus noise, while imaging from a moving, unstabilized EBS outputs data representing the entire imaged scene, plus noise.
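The per-pixel change detection described above can be sketched in a few lines. This is a simplified frame-based simulation, not the actual asynchronous EBS circuit: the `generate_events` helper, the log-irradiance model, and the threshold value are all illustrative assumptions.

```python
import numpy as np

def generate_events(prev_frame, curr_frame, threshold=0.2):
    """Emit (row, col, polarity) events where log irradiance changed enough.

    Only pixels whose change exceeds the threshold produce output, which is
    what lets an event-based sensor reduce read-out bandwidth relative to a
    camera that reads out every pixel of every frame.
    """
    eps = 1e-6  # avoid log(0)
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    rows, cols = np.nonzero(np.abs(delta) > threshold)
    polarities = np.sign(delta[rows, cols]).astype(int)
    return list(zip(rows.tolist(), cols.tolist(), polarities.tolist()))

# A static scene produces no events; one brightening pixel produces one event.
frame_a = np.ones((4, 4))
frame_b = frame_a.copy()
frame_b[1, 2] = 2.0  # irradiance doubles at a single pixel

print(generate_events(frame_a, frame_a))  # -> []
print(generate_events(frame_a, frame_b))  # -> [(1, 2, 1)]
```

A real EBS does this comparison asynchronously in analog electronics at each pixel, timestamping events as they occur rather than waiting for a frame clock.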

    In many applications, including IR cameras, defense, and autonomous cars, read-out bandwidth can limit performance. Here, noise, temperature, and other factors limit electronic speeds. By reducing bandwidth requirements with EBS, we hope to enable the usage of better cameras, improving performance in target detection tasks.

    Our goal is to quantify both the bandwidth reduction and the detection performance of EBS. We are measuring these across different scenarios, searching for conditions where targets can be detected or recognized with both EBS and conventional camera technology. If the EBS achieves similar detection performance at a reduced bandwidth, that demonstrates the potential of the technology. We expect bandwidth-limited applications such as those above to benefit significantly from positive results.
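The bandwidth comparison at the heart of this evaluation is back-of-envelope arithmetic. The sketch below uses purely illustrative numbers (frame resolution, event rate, and event packet size are all assumptions, not measurements from this research):

```python
# Conventional frame camera: every pixel is read out every frame.
height, width, fps, bits_per_pixel = 480, 640, 120, 10
frame_bandwidth = height * width * fps * bits_per_pixel  # bits per second

# Event camera: only changed pixels are read out.
events_per_second = 200_000  # assumed event rate for a sparse scene
bits_per_event = 64          # assumed (x, y, polarity, timestamp) packet
event_bandwidth = events_per_second * bits_per_event

print(f"frame camera: {frame_bandwidth / 1e6:.1f} Mbit/s")
print(f"event camera: {event_bandwidth / 1e6:.1f} Mbit/s")
print(f"reduction:    {frame_bandwidth / event_bandwidth:.0f}x")
```

The actual reduction depends heavily on scene dynamics: a busy, moving background drives the event rate up and erodes the advantage, which is exactly why the research measures performance across different scenarios.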

    Name three neat facts about you.

    1. I’ve lived in the Southwest my whole life. While I enjoy it here, one day I’d like to move to the East Coast.
    2. I used to play football in high school. I got a big interception in a game and was tackled wrong. This tore my ACL, ending my football career and leading me to pursue a STEM career. This turned out to be a good idea because I get to work on interesting problems with wonderful people.
    3. I am a Department of Defense SMART Scholar and will be working for the Air Force Research Laboratory in New Mexico after graduation.

    Research caption: I imaged a moving 3D-printed target against a complex background. Both were recorded simultaneously with a moving, stabilized EBS and a moving, unstabilized EBS, corresponding to a mostly static background and a moving background, respectively. The target is straightforward to detect with both cameras. However, the unstabilized EBS uses more bandwidth and adds background clutter, which can degrade task performance.