🚀 Internship

Current PhD, Applied Research Intern - Summer 2024

Capital One

AI generated summary

  • The ideal candidate for the Applied Research Intern position at Capital One should be a creative and determined PhD student with a strong technical background in mathematics and deep learning theory. They should have programming experience in Python, PyTorch, C++, and other deep learning frameworks, as well as a track record of publications in leading conferences. Additionally, the candidate should have a focused area of research in one of the specified fields such as LLM pre-training, natural language processing, behavioral models, optimization, or large-scale data preparation.
  • The candidate will engage in applied research, collaborate with a cross-functional team, leverage various technologies, translate complex work into business goals, publish papers at academic conferences, and develop professionally through networking and executive speaker sessions.

Summer Internship

Software Engineering · San Francisco, New York


  • This is a paid, limited-time internship position, and Capital One will not sponsor a new applicant for employment authorization for this position.
  • However, a full-time Applied Research role, for which you may be considered upon completion of the internship (subject to business need, market conditions, and other factors), is eligible for employer immigration sponsorship.


  • The Ideal Candidate:
  • You love the process of analyzing and creating, but also share our passion to do the right thing. You want to work on problems that will help change banking for good.
  • Innovative. You continually research and evaluate emerging technologies. You stay current on published state-of-the-art methods, technologies, and applications and seek out opportunities to apply them.
  • Creative. You thrive on bringing definition to big, undefined problems. You love asking questions and pushing hard to find answers. You’re not afraid to share a new idea.
  • Technical. You possess a strong foundation in mathematics, deep learning theory, and the engineering required for contributing to the development of AI.
  • Determined. Strengthen your field of study by applying theory to practice. Bring your ideas to life in industry.
  • Basic Qualifications:
  • Currently enrolled in an accredited PhD Program
  • Completed 2nd year of PhD coursework by program start date
  • Preferred Qualifications:
  • Completed 3rd or 4th year of PhD Program
  • PhD in Computer Science, Machine Learning, Computer Engineering, Applied Mathematics, Electrical Engineering or related fields
  • Programming experience in Python, PyTorch, C++, and other deep learning frameworks
  • Publications in leading conferences such as KDD, ICML, NeurIPS, ICLR, ACL, NAACL, or EMNLP
  • Focused area of research in one of the following areas:
  • LLM Pre-training
  • PhD focus on Natural Language Processing
  • Publications on topics related to the pre-training of large language models (e.g. technical reports of pre-trained LLMs, SSL techniques, model pre-training optimization)
  • Publications in deep learning theory
  • LLM Finetuning
  • PhD focused on topics related to guiding LLMs with further tasks (Supervised Finetuning, Instruction-Tuning, Dialogue-Finetuning, Parameter Tuning)
  • Demonstrated knowledge of principles of transfer learning, model adaptation and model guidance
  • Behavioral Models
  • PhD focus on topics in geometric deep learning (Graph Neural Networks, Sequential Models, Multivariate Time Series)
  • Contributions to common open source frameworks (pytorch-geometric, DGL)
  • Proposed new methods for inference or representation learning on graphs or sequences
  • Optimization (Training & Inference)
  • PhD focused on topics related to optimizing training of very large deep learning models
  • Multiple years of experience and/or publications on one of the following topics: Model Sparsification, Quantization, Training Parallelism/Partitioning Design, Gradient Checkpointing, Model Compression
  • Deep knowledge of deep learning algorithmic and/or optimizer design
  • Large Scale Data Preparation
  • Publications studying tokenization, data quality, dataset curation, or labeling

Education requirements

Currently Studying

Area of Responsibilities

Software Engineering


  • Join Capital One for a full-time, 12-week, summer applied research experience, discovering solutions to real-world, large-scale problems.
  • Engage in high impact applied research with the goal of taking the latest AI developments and pushing them into the next generation of customer experiences, or contributing to publications in this field.
  • Partner with a cross-functional team of applied researchers, data scientists, software engineers, machine learning engineers, and product managers to test and design AI-powered products that change how customers interact with their money.
  • Leverage a broad stack of technologies — PyTorch, AWS UltraClusters, Hugging Face, Lightning, vector databases, and more — to reveal the insights hidden within huge volumes of numeric and textual data.
  • Flex your interpersonal skills to translate the complexity of your work into tangible business goals.
  • Partner with leading researchers to publish papers at top academic conferences.
  • Develop professionally through networking sessions, technical deep dives, and executive speaker sessions from across Capital One.


Work type

Full time

Work mode



San Francisco, New York