We are presenting at 3 CPS-IoT Week sessions this week:
IPSN’22: EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition presents the first method for synthesizing eye movement data for training eye movement-based activity classifiers for AR and VR without human involvement. [PDF] [Code and data] [NSF Discoveries news item covering this work]
Heartfelt congratulations to Alex Hu for receiving the Spring 2022 ECE Independent Study Best Poster Award for his poster titled “Impact of the Environment on Eye Tracking Efficacy in Headset Augmented Reality”. The poster was prepared as part of Alex’s year-long Duke ECE undergraduate Honors Thesis, “Eye Tracking for User and Environment-Aware Headset Augmented Reality”.
Two papers from the group are set to be presented at workshops co-located with IEEE VR 2022. Both papers are centered on end applications of next-generation augmented reality (AR) techniques we are currently developing in our lab. Continue reading
9 ECE and CS independent undergraduate research projects on next-generation AR and VR were completed in the lab over this semester. This work is supported in part by NSF grants CSR-1903136, CNS-1908051, and CAREER-2046072, and by an IBM Faculty Award.
Experimental setup (left), and the view of holograms overlaid in AR (right)
Seijung worked on improving the AR-assisted surgery platform we have been developing. The current platform operates with two world coordinate systems, one from OptiTrack and one from HoloLens 2, and thus requires calibration to convert between them. She designed and built a 3D-printed cube with an ARToolKit marker to successfully calibrate the two world coordinate systems. She then worked on making optical marker tracking more robust for both stationary and mobile objects. Unlike a stationary phantom model, a surgical tool moves during surgery, and its tracking suffers from false positives and optical marker noise. Seijung explored various designs of 3D-printed mounts for a surgical tool to determine the optimal configuration in terms of the number of markers, the distance between markers, and the direction of marker attachment. She will continue working on integrating more contextual guidance into the system and on bringing the system to the level of robustness and accuracy required for our user studies in clinical settings.
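Calibrating between two world coordinate systems like this typically means estimating a rigid transform (rotation plus translation) from corresponding 3D points observed by both trackers. The sketch below illustrates the general idea with the standard Kabsch algorithm; the marker positions are hypothetical and this is not the lab's actual implementation.

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ~= src @ R.T + t,
    given corresponding 3D points (Kabsch algorithm)."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    # Covariance of the centered point sets
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against reflections (det = -1 cases)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Hypothetical marker positions seen in both coordinate systems
optitrack_pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                          [0.0, 0.1, 0.0], [0.0, 0.0, 0.1]])
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
hololens_pts = optitrack_pts @ R_true.T + np.array([0.5, -0.2, 1.0])

R, t = rigid_transform(optitrack_pts, hololens_pts)
print(np.allclose(optitrack_pts @ R.T + t, hololens_pts, atol=1e-6))  # True
```

With four or more non-coplanar marker points (such as the corners of a calibration cube), the transform is over-determined and the SVD solution is the least-squares best fit.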
Duke University I^3T Lab has multiple openings for PhD students and postdocs.
We work in the area of pervasive mobile and sensing systems broadly, and pervasive mobile Augmented Reality (AR) and next-generation intelligence for the Internet of Things in particular. Our work is generously supported by the National Science Foundation (NSF), the Lord Foundation of North Carolina, IBM, Facebook, and DARPA. We are also a part of the recently established NSF AI Institute for Edge Computing Leveraging Next Generation Networks, where our work is focused on building next-generation AI-powered mobile augmented reality.
If you are interested in joining the lab as a PhD student, please e-mail Professor Gorlatova (maria.gorlatova /at/ duke.edu) your CV, transcripts, and a brief note about your research interests. We have openings for both CS and ECE PhD students. Strong candidates for PhD studies generally have an undergraduate GPA above 3.6/4 and have experience conducting research or developing advanced technical solutions outside of classroom settings (e.g., in independent studies, internships, employment, or extracurricular projects). We have openings for PhD students with start dates in August 2022.
If you are interested in joining the lab as a postdoc, please e-mail Professor Gorlatova (maria.gorlatova /at/ duke.edu) your CV, a brief note about your research interests, and 1–3 papers that you believe represent your best work to date. Candidates for postdoctoral positions need to have a track record of publishing in top venues of the field, and need to be self-driven and independent. The postdoctoral positions’ start dates are flexible, ranging from as early as January 2022 to as late as August 2022. The lab’s previous postdoctoral affiliate, Dr. Guohao Lan, has successfully secured an independent Assistant Professor position at a top university.
This summer we were fortunate to be able to virtually host 3 Research Experience for Undergraduates (REU) students in the I^3T Lab, through the Duke University REU Site for Meeting Grand Challenges in Engineering. The research the students were engaged in is supported in part by NSF grants CSR-1903136, CNS-1908051, and CAREER-2046072, and by an IBM Faculty Award. Continue reading
Delightful news today: Apple is planning to build a brand new campus here in Durham, spending $1 billion over 10 years and creating 3,000 highly skilled jobs, particularly in AI, ML, and software engineering.
Exciting news for the region as a whole, and for the tech scene in particular, especially following Google’s announcement, only six weeks ago, of a cloud computing hub with over 1,000 jobs here in Durham as well. Very exciting news for me as a computer systems faculty member. So many opportunities for discussions and collaborations. So many new invited speakers and seminar attendees. So many local opportunities for student internships and full-time positions. Good news all around.
It was a true pleasure to attend the 2021 National Academy of Engineering Frontiers of Engineering Symposium (NAE FOE).
The NAE FOE is unique in bringing together engineers from all engineering disciplines. It was a blast. A rare opportunity to step back and learn about challenges in a wide variety of engineering areas, and to think about how one’s own work fits with the broader view of engineering as a profession. I kept thinking about the Iron Ring on my finger, and back to the ceremony, the Ritual of the Calling of an Engineer, where my graduating University of Ottawa class received these rings. I am thrilled that my professional journey has taken me from that ceremony to a Symposium dedicated to the frontiers of the profession.
The Iron Ring is a reminder of the professional commitment of the engineer.
7 ECE and CS independent undergraduate research projects have been completed in the I^3T lab over the Fall of 2020. The projects are summarized below. This work is supported in part by NSF grants CSR-1903136 and CNS-1908051, an IBM Faculty Award, and the Lord Foundation of North Carolina.
Evaluating Object Detection Models through Photo-Realistic Synthetic Scenes in Game Engines
Achintya Kumar and Brianna Butler
We built an automatic pipeline to evaluate object recognition algorithms within photo-realistic 3D scenes generated in game engines, namely Unity (with the High Definition Render Pipeline) and Unreal. Specifically, we test detection accuracy and intersection over union (IoU) under different lighting conditions, reflection and transparency levels, camera and object rotations, blurring, occlusions, and object textures. The pipeline collects a large-scale dataset under these conditions without manual capturing, by controlling multiple parameters in the game engines. For example, we control illumination by changing the lux values and types of the light sources; we control reflection and transparency levels via custom render pipelines; and we control texture by assembling a texture library and randomly choosing object textures from it. Another important component of the pipeline is per-pixel ground truth generation, where the RGB value of each pixel in the ground truth image indicates the ID of the object that pixel belongs to. With this ground truth, detection accuracy and IoU are obtained without manual labeling.
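The IoU metric used above can be sketched in a few lines for the common case of axis-aligned bounding boxes. This is a generic illustration, not the project's code; boxes are assumed to be given as (x_min, y_min, x_max, y_max) tuples.

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes,
    each given as (x_min, y_min, x_max, y_max)."""
    # Overlap rectangle (empty if the boxes are disjoint)
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two 2x2 boxes overlapping in a 1x1 square: IoU = 1 / (4 + 4 - 1)
print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 0.14285714285714285
```

With per-pixel ground truth, the same metric generalizes to segmentation masks by counting overlapping and union pixels per object ID instead of rectangle areas.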