Big Data Software Engineer
Company: CSC Holdings, LLC
Location: San Francisco, CA
Posted on: November 5, 2015
Job Description:
Create the next generation of software applications for a Big Data platform using open-source technologies and cloud infrastructure. Design and develop ETL pipelines to move data into distributed storage (HBase, HDFS) in real-time and batch modes. Analyze performance, troubleshoot, and debug to identify and correct issues. Build predictive models that optimize operational processes, anticipate customer needs, and create a competitive advantage for the organization. Apply a systematic approach to problem-solving using mathematical and statistical techniques and quantitative analytical approaches.

Requirements: Master's degree in Computer Science, Engineering, Information Systems, or a related field (willing to accept foreign education equivalent) plus 3 years of experience performing design and implementation of big data ETL; or, in the alternative, a Bachelor's degree and 5 years of experience as stated above.

Also requires:
1) Demonstrated Expertise (DE) performing core Java development, including building distributed systems using Hadoop, MapReduce, and HDFS;
2) DE performing SQL programming on database management systems such as Oracle and MySQL;
3) DE performing HBase and Hive queries for storing and retrieving data;
4) DE performing data analysis using Python and R;
5) DE utilizing SVN, Ant, Maven, and Jira to perform project management; and
6) DE working with medium-to-large Hadoop clusters (50+ nodes) and systems with 100+ terabytes of data, and developing the complete architecture of a Hadoop ecosystem with 50+ servers.

Submit resume to recruit@cablevision.com and include the job title and "Job Code GCKS" in the subject line.