ETL Developer - Charlotte, NC
This technical role will be responsible for the following:
- Design high-performing data models on big-data architectures, delivered as data services.
- Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, and Amazon S3-based object storage.
- Partner with enterprise data teams such as Data Management & Insights and the Enterprise Data Environment (data lake) to identify the best place to source the data.
- Work with business analysts, development teams, and project managers to gather requirements and business rules.
- Collaborate with source system and approved provisioning point (APP) teams, architects, data analysts, and modelers to build scalable, performant data solutions.
- Work effectively in a hybrid environment where legacy ETL and data warehouse applications coexist with new big-data applications.
- Work with infrastructure engineers and system administrators as appropriate in designing the big-data infrastructure.
- Work with DBAs in the Enterprise Database Management group to troubleshoot problems and optimize performance.
- Support ongoing data management efforts for the Development, QA, and Production environments.

Required qualifications:
- 7+ years of application development and implementation experience
- 7+ years of experience delivering complex, enterprise-wide information technology solutions
- 7+ years of ETL (Extract, Transform, Load) programming experience
- 7+ years of reporting experience, analytics experience, or a combination of both
- 5+ years of Hadoop experience
- 5+ years of operational risk, conduct risk or compliance domain experience
- 5+ years of experience delivering ETL, data warehouse and data analytics capabilities on big-data architecture such as Hadoop
- 5+ years of Java or Python experience
- Excellent verbal, written, and interpersonal communication skills
- Ability to work effectively in virtual environment where key team members and partners are in various time zones and locations
- Knowledge and understanding of project management methodologies used in waterfall or Agile development projects
- Knowledge and understanding of DevOps principles
- Ability to interact effectively and confidently with senior management
- Experience designing and developing data analytics solutions using object data stores such as S3
- Experience with Hadoop-ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark, or Apache Storm
United States of America
Job Title: ETL Developer
Location: Charlotte, NC
Job Type: Temp Position
Reference ID: 363202
Posted Date: 4/9/2019