Student Researcher, Fundamental Language - 2024 Start (PhD) - Seattle

TikTok


Applications are closed

Off-cycle Internship

Research & Development, Data • Seattle


  • This position is responsible for researching and building the company's LLMs. The role involves exploring new applications and solutions for related technologies in areas such as search, recommendation, advertising, content creation, and customer service. The goal is to meet the increasing demand for intelligent interactions from users and to significantly enhance their lifestyle and communication in the future.
  • We are looking for talented individuals to join us for a Student Researcher opportunity in 2024. Student Researcher opportunities at TikTok aim to offer students industry exposure and hands-on experience. Turn your ambitions into reality as your inspiration brings infinite opportunities at TikTok.
  • The Student Researcher position provides unique opportunities that go beyond the constraints of our standard internship program, allowing for flexibility in duration, time commitment, and location of work.
  • Candidates can apply to a maximum of two positions and will be considered for jobs in the order in which they apply. The application limit applies to TikTok and its affiliates' jobs globally. Applications will be reviewed on a rolling basis - we encourage you to apply early.


  • Currently enrolled in a PhD program in Computer Science, Linguistics, Statistics, or a related technical field.
  • Excellent knowledge of the theory and practice of LLMs and foundation models, as well as deep learning.
  • Strong publication record at leading conferences (ACL, EMNLP, NeurIPS, ICML etc.).
  • Excellent coding ability, familiarity with data structures, and strong fundamental algorithm skills; proficiency in C/C++ or Python. Winners of competitions such as ACM/ICPC, USACO/NOI/IOI, TopCoder, and Kaggle are preferred.
  • Good communication and collaboration skills, with the ability to explore new technologies with the team and drive technical progress.
Preferred Qualifications:
  • Graduating in December 2024 or later, with intent to return to the degree program after completing the internship.
  • Demonstrated software engineering, natural language processing, or deep learning experience from previous internships, work experience, coding competitions, or publications.
  • High levels of creativity and quick problem-solving capabilities.

Education requirements

Currently Studying

Area of Responsibilities

Research & Development


  • LLM reasoning and planning: enhance LLM reasoning and planning across the entire development process (data acquisition, model evaluation, pretraining, SFT, reward modeling, and reinforcement learning) to bolster overall LLM performance.
  • Synthesize large-scale, high-quality data through methods such as rewriting, augmentation, and generation to improve the abilities of LLMs at various stages (pretraining, SFT, RLHF).
  • Investigate and implement robust evaluation methodologies to assess the performance of LLMs at various stages, unravel the underlying mechanisms and sources of their abilities, and use this understanding to drive model improvements.


Work type

Full time

Work mode