
Data Engineer

Company: JobRialto
Location: San Francisco
Posted on: June 11, 2024

Job Description:

Designs, develops, and implements Hadoop ecosystem-based applications to support business requirements.

Follows approved life cycle methodologies, creates design documents, and performs program coding and testing. Resolves technical issues through debugging, research, and investigation.


Experience/Skills Required:

Bachelor's degree in Computer Science, Information Technology, or a related field, and 5 years of experience in computer programming, software development, or a related area.

3+ years of solid Java experience and 2+ years of experience in the design, implementation, and support of big data solutions in Hadoop using Hive, Spark, Drill, Impala, and HBase.
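
For illustration only, a minimal PySpark sketch of the kind of Hive-on-Spark work this requirement describes (the table and column names are hypothetical, not taken from this posting) might look like:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: aggregate daily order totals from a Hive table using Spark.
spark = (
    SparkSession.builder
    .appName("daily-order-totals")
    .enableHiveSupport()   # lets Spark read and write Hive tables
    .getOrCreate()
)

orders = spark.table("sales_db.orders")   # hypothetical Hive table

daily_totals = (
    orders
    .where(F.col("order_status") == "COMPLETE")
    .groupBy("order_date")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_total").alias("revenue"),
    )
)

# Persist the result back to Hive for downstream consumers.
daily_totals.write.mode("overwrite").saveAsTable("sales_db.daily_order_totals")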

Hands-on experience with Unix, Teradata, and other relational databases.

Experience with @Scale is a plus.

Strong communication and problem-solving skills

Requires knowledge of Analytics/big data analytics/automation techniques and methods; Business understanding; Precedents and use cases; Business requirements and insights. To translate and co-own business problems within one's discipline into data-related or mathematical solutions.

Identify appropriate methods/tools to be leveraged to provide a solution for the problem. Share use cases and give examples to demonstrate how the method would solve the business problem.

Requires knowledge of the business value and relevance of data and data-enabled insights/decisions; Appropriate application and understanding of the data ecosystem, including Data Management, Data Quality Standards and Data Governance, Accessibility, Storage and Scalability, etc.;

Understanding of the methods and applications that unlock the monetary value of data assets.

To understand, articulate, interpret, and apply the principles of the defined strategy to unique, moderately complex business problems that may span one or more functions or domains.

Requires knowledge of Data quality management techniques and standards; Business metadata definitions and content data definitions; Data profiling tools, data cleansing tools, data integration tools, and issues and event management tools; Understanding of user's data consumption, data needs, and business implications; Data modeling, storage, integration, and warehousing; Data quality framework and metrics; User access best practices; Enterprise data architecture, modeling and design, storage, integration, and warehousing; Enterprise data quality framework and metrics; Enterprise data strategy; Enterprise data quality strategy; Enterprise strategy to address regulatory and ethical requirements and policies around data privacy, security, storage, retention, and documentation.

To promote and educate others on data quality awareness.

Profile, analyze, and assess data quality.

Test and validate data quality requirements.

Continuously measure and monitor data quality.

Manage operational Data Quality Management procedures.

Manage data quality issues and lead data cleansing activities to remove data quality defects, improve data quality, and eliminate unused data.

Determine user accessibility and remove or restrict user access as needed.
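
For illustration, a rough sketch of the profiling and monitoring work described above, written in Python with pandas (the file name, column names, and thresholds are hypothetical), might look like:

import pandas as pd

# Hypothetical example: profile a customer extract and flag basic data quality issues.
df = pd.read_csv("customers.csv")   # hypothetical input file

profile = pd.DataFrame({
    "null_pct": df.isna().mean(),    # share of missing values per column
    "distinct": df.nunique(),        # cardinality per column
    "dtype": df.dtypes.astype(str),
})
print(profile)

# Simple rule-based checks; the rules and thresholds are illustrative only.
issues = []
if df["customer_id"].duplicated().any():
    issues.append("duplicate customer_id values")
if df["email"].isna().mean() > 0.05:
    issues.append("more than 5% of email values are missing")

print("data quality issues:", issues or "none found")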

Interpret company and regulatory policies on data.

Educate others on data governance processes, practices, policies, and guidelines.

Requires knowledge of relevant Knowledge Discovery in Data (KDD) tools, applications, or scripting languages such as SQL, DB, SAS, Oracle, Apache Mahout, MS Excel, and Python; Statistical techniques (for example, mean, mode, median, variance, standard deviation, correlation, and sorting and grouping); Research analysis standards and activities; Documentation procedures such as drafting, editing, and bibliography format; KDD industry best practices and emerging trends.

To collect and tabulate data and evaluate results to determine accuracy, validity, and applicability.

Support the identification and application of statistical techniques based on requirements. Apply suitable techniques under direction from leadership.

Assist in the planning, design, and implementation of exploratory data analysis research projects.

Understand existing statistical models and identify and recommend statistical models based on hypotheses.

Use advanced Knowledge Discovery in Data (KDD) tools to write queries and analyze data to identify patterns, trends, outliers, and correlations.

Conduct statistical analysis (for example hypothesis tests, confidence intervals) and build basic statistical models using relevant packages/software suites.
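
As an illustrative sketch of this kind of analysis (the data below is synthetic and the scenario is hypothetical), a hypothesis test and confidence interval in Python might look like:

import numpy as np
from scipy import stats

# Hypothetical example: compare order values between two customer segments.
rng = np.random.default_rng(42)
segment_a = rng.normal(loc=52.0, scale=8.0, size=200)   # synthetic data
segment_b = rng.normal(loc=50.0, scale=8.0, size=200)

# Welch's two-sample t-test (no equal-variance assumption).
t_stat, p_value = stats.ttest_ind(segment_a, segment_b, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

# 95% confidence interval for the mean of segment A.
mean_a = segment_a.mean()
sem_a = stats.sem(segment_a)
ci_low, ci_high = stats.t.interval(0.95, len(segment_a) - 1, loc=mean_a, scale=sem_a)
print(f"segment A mean = {mean_a:.2f}, 95% CI = ({ci_low:.2f}, {ci_high:.2f})")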

Education: Bachelor's degree

