Internship

Data Engineering Intern

Copart • 1mo ago
Dallas

AI-generated summary

  • You need a Bachelor's degree in Computer Science or a similar field, 1+ years of experience in data engineering, and knowledge of real-time data pipelines, BI tools, ETL, and databases such as SQL Server, Oracle, and more. Good communication skills and proficiency in SQL and Python are a must.
  • You will design, build, and automate data processing systems, collaborate with teams, monitor processes, and ensure data integrity in production systems.

Off-cycle Internship

Data, Software Engineering • Dallas

Description

  • The Data Engineering Intern will be part of the Data Engineering Team, which works closely with all aspects of applications and data pipelines. We are looking for a Data Engineering Intern to assist in designing, developing, and optimizing the flow of data throughout the organization, enabling end users to derive valuable insights from data across Copart. In this role, your work will broadly influence the company's data consumers, executives, and analysts.

Requirements

  • Bachelor's degree or higher in Computer Science, Engineering, or a similar field
  • 1+ years of experience designing, developing, testing and implementing scalable, high-performing data warehouse and BI solutions
  • Hands-on experience with real-time data pipelines, including Kinesis and Kafka; understanding of database architecture, including MPP, and of ad-hoc analysis using BI/analytical tools such as Tableau, Pentaho, OBIEE, or Power BI
  • Proven ability to analyze complex business problems using data and translate the analysis into actionable insights
  • Experience with ETL, data blending, and transformation tools such as Pentaho Data Integration, Talend Data Integration, or Informatica
  • Enterprise development knowledge and/or experience with databases such as SQL Server, Oracle, and MySQL, and columnar databases such as Vertica, Snowflake, MemSQL, Netezza, and Redshift
  • Good understanding of technology and the industry, with the ability to decide on the best technology for integrated solutions
  • Proven ability to communicate with business and technical audiences at all levels, including demonstrated success influencing senior leaders and decision makers
  • Technical skills required: SQL, Python. BI/analytical tools preferred: Tableau, Pentaho, Power BI

Education requirements

Currently Studying
Bachelor's

Area of Responsibilities

Data
Software Engineering

Responsibilities

  • Design and build the next-generation data platform
  • Develop and automate data processing systems to deliver data insights at enterprise scale
  • Develop logging, metrics, and alerts that enable active monitoring of designed processes
  • Collaborate with Product Managers and application teams to develop data models and schemas that provide easy access to complex data sets
  • Assist in maintaining data integrity in production systems
  • Balance and prioritize multiple conflicting requirements with high attention to detail

Details

Work type

Full time

Work mode

Office

Location

Dallas