CRUNCH Group, Providence, RI
I drove down to Providence in late May. I had just finished exams but was still training for crew nationals on campus. I walked inside the Brown Applied Math building and snuck into the back of a packed classroom, where a guest lecturer was presenting on optimization over loss manifolds; I had no idea what was going on. The seminar ended, and my supervisor, Dr. Karniadakis, introduced himself before whisking me away to the lab upstairs. We sat down in a conference room with my mentor, Vivek, a first-year Ph.D. student, and I was given a crash course on the basics of deep learning. Armed with my first homework assignment, creating a model that could predict a piecewise periodic function, I returned to Williams that night. The next morning, after practice, I started coding, hopelessly lost.
It took a few weeks before I started to gain some confidence. The first month of my internship followed the same rhythm: Vivek would introduce a new concept, I would spend a few days on a homework assignment, and then I would send my final product as a report to Dr. Karniadakis. I began to learn more lab-specific skills, culminating in PINNs (Physics-Informed Neural Networks), which use deep learning to solve differential equations. PINNs are revolutionary in their ability to combine observational data with the constraints of the physical world, allowing for robust predictions even with only a handful of measurements. Researchers in the lab apply this technology to problems such as building climate models and designing more efficient turbines. By the time I had mastered this skill, I was surprised to find that I could implement the lab's models from scratch.
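To give a concrete sense of the idea, here is a minimal sketch of what a PINN can look like, written in PyTorch as an assumption about tooling rather than a reproduction of the lab's code; the toy equation du/dx = cos(x) with u(0) = 0, the three "measurements," and the network sizes are all illustrative choices.

```python
# Minimal PINN sketch (illustrative, not the lab's code): fit u(x) satisfying
# du/dx = cos(x) with u(0) = 0 by combining a data loss on a few noisy
# measurements with a physics-residual loss enforced at collocation points.
import torch
import torch.nn as nn

torch.manual_seed(0)

net = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)

# A handful of observations of the true solution u(x) = sin(x)
x_data = torch.tensor([[0.5], [2.0], [4.0]])
u_data = torch.sin(x_data) + 0.01 * torch.randn_like(x_data)

# Collocation points where the differential equation residual is enforced
x_phys = torch.linspace(0.0, 2 * torch.pi, 64).reshape(-1, 1).requires_grad_(True)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(5000):
    opt.zero_grad()
    # Data loss: match the sparse measurements
    loss_data = ((net(x_data) - u_data) ** 2).mean()
    # Physics loss: penalize the residual du/dx - cos(x), computed via autograd
    u = net(x_phys)
    du_dx = torch.autograd.grad(u, x_phys,
                                grad_outputs=torch.ones_like(u),
                                create_graph=True)[0]
    loss_phys = ((du_dx - torch.cos(x_phys)) ** 2).mean()
    # Boundary loss: enforce u(0) = 0
    loss_bc = net(torch.zeros(1, 1)).pow(2).mean()
    loss = loss_data + loss_phys + loss_bc
    loss.backward()
    opt.step()
```

The point of the sketch is that the same network is constrained by both the sparse data and the governing equation, with the derivative in the physics term obtained by automatic differentiation rather than by discretizing the domain.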
In addition to learning new concepts, I began making modifications to existing frameworks and observing how my changes affected performance. I focused on self-adaptive machine learning methods: building models that could learn how to learn. A PINN's total loss combines data-fit and physics-residual terms, and depending on the application and the quality of the observational data, the relative importance of each term can vary significantly. Most of my work revolved around developing a way to quantify model performance as a function of changes in the loss-term weights.
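As a rough illustration of what "learning how to learn" can mean in this context, the sketch below shows one common self-adaptive pattern, where each loss term gets a trainable weight that is updated by gradient ascent while the network is trained by gradient descent; the toy network, the two loss terms, and the learning rates are assumptions made for the example, and this is not my actual weighting scheme.

```python
# Hypothetical self-adaptive loss-balancing sketch (illustrative only): the
# network parameters minimize the weighted loss while the per-term weights
# are pushed in the opposite direction, so terms the model handles poorly
# automatically gain emphasis during training.
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))

x = torch.linspace(-1.0, 1.0, 32).reshape(-1, 1)
y = x ** 3                                    # toy regression target
log_w = torch.zeros(2, requires_grad=True)    # one learnable log-weight per loss term

opt_net = torch.optim.Adam(net.parameters(), lr=1e-3)
opt_w = torch.optim.Adam([log_w], lr=1e-2)

for step in range(2000):
    pred = net(x)
    loss_fit = ((pred - y) ** 2).mean()                # data-fitting term
    loss_con = net(torch.zeros(1, 1)).pow(2).mean()    # toy constraint: f(0) = 0
    losses = torch.stack([loss_fit, loss_con])
    weighted = (log_w.exp() * losses).sum()

    opt_net.zero_grad()
    opt_w.zero_grad()
    weighted.backward()
    opt_net.step()                # gradient descent on the network parameters
    log_w.grad = -log_w.grad      # flip the sign: gradient ascent on the weights
    opt_w.step()
```

In practice one would typically cap or normalize such weights so they do not grow without bound; the takeaway is only that the balance between loss terms becomes something the training loop itself adjusts, rather than a hand-tuned constant.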
At the end of my internship, I presented my work during the lab's weekly CRUNCH seminar. I shared the highlights of what I had learned and presented my novel loss-weight balancing scheme. After the presentation, Dr. Karniadakis offered me a position in the lab as a Ph.D. student. This acknowledgement of my work was gratifying, as I had worked independently for much of the summer and was unsure whether I was building anything worthwhile. I enjoyed working in a high-energy lab and have already started using my new deep-learning skills in my computational biology thesis. I now want to pursue a Ph.D. and continue working on challenging, open-ended projects. I am grateful for this opportunity both to grow academically and to figure out how I want to proceed post-graduation.