CV

Below is a condensed version of my CV. For the full CV, please click the Download PDF button.

Basics

Name: Rishi Singhal
Label: Graduate Research Assistant
Email: rsingha4@ncsu.edu
URL: https://rishi2019194.github.io
Summary: Passionate researcher dedicated to advancing explainable and interpretable AI, with a primary research focus on the dynamics of memorization and generalization in deep neural networks across NLP and CV. Interested in methods to improve efficiency, robustness, safety, and privacy in neural networks.

Work

  • 2024.05 - 2024.08
    Machine Learning Intern
    Fermilab
    Worked on deploying and optimizing machine learning systems for neutrino experiments.
    • Deployed Graph Neural Networks (NuGraph2/3) on Fermilab’s Elastic Analysis Facility (EAF) using NVIDIA Triton and Docker.
    • Enabled real-time background filtering and semantic labeling for MicroBooNE.
    • Integrated a Python/C++ client with LArSoft for direct streaming, reducing memory overhead by 20%.
    • Extended the NuSonic Triton framework for scalability and maintainability.
    • Contributed production-level code adopted in Fermilab’s official reconstruction pipeline.
  • 2024.01 - Present
    Graduate Research Assistant
    Dr. Jung-Eun Kim Lab, North Carolina State University
    Conducting research on memorization and generalization in deep neural networks, with a focus on transformer architectures.
    • Discovered a novel role of LayerNorm in shaping memorization vs. generalization across Pre-LN and Post-LN models.
    • Verified findings on both generative and classification tasks across NLP and CV.
    • Showed that pruning only 0.1–0.2% of Post-LN parameters reduces memorization by ~70% without harming generalization.
    • Demonstrated that early LayerNorm layers exert a stronger influence than later ones.
    • Ongoing: Distinguishing memorization vs. generalization at the feature level and studying the impact of residual connections in large-scale LLMs (GPT, LLaMA).
  • 2022.01 - 2023.04
    Undergraduate Research Assistant
    MIDAS Lab, IIIT Delhi
    Conducted research on document coherence in NLP tasks.
    • Investigated coherence as a core metric for evaluating text quality in summarization, translation, and QA.
    • Applied Topological Data Analysis (TDA) to attention graphs of BERT and RoBERTa models.
    • Developed a lightweight MLP leveraging TDA features, outperforming transformer baselines by 5% on the GCDC dataset.

Education

  • 2025.01 - 2028.01
    PhD, Computer Science
    North Carolina State University
    Raleigh, USA
  • 2023.01 - 2025.01
    Master's, Computer Science
    North Carolina State University
    Raleigh, USA
  • 2019.01 - 2023.01
    BTech, Electronics and Communication Engineering
    Indraprastha Institute of Information Technology (IIIT) Delhi
    Delhi, India

Publications

Skills

Programming & Tools: Python, C++, SQL, MATLAB, PyTorch, TensorFlow, Keras, Scikit-Learn, NumPy, Pandas, spaCy, NLTK, NVIDIA Triton, MCP, Docker, Flask, Postman, Git

Projects