Johnson & Johnson Enterprise Supply Chain’s Digital & Analytics group is seeking skilled, experienced data engineers for the supply chain data management team to handle large volumes of data stored across different storage systems. Suitable candidates should be able to identify the data required for a given operation, build a pipeline for extracting it, and determine the computing resources needed to process it. This role is based in Bridgewater, NJ.
We are looking for a hardworking data engineering subject matter expert with strong working experience in databases. The ideal candidate will have substantial experience with large data sets and the ability to architect their storage; hands-on skill in writing SQL or equivalent queries to identify and extract data; and experience using scripts or equivalent services to extract large amounts of data in real time from multiple sources to build a data pipeline. The candidate will also bring excellent communication skills, a proven track record of leadership, and comfort interfacing with different stakeholders and teams/groups for project purposes; will act as a passionate change agent; and will keep pace with industry trends and emerging technologies. The candidate should have a passion for conceptualizing, prototyping, and developing data architecture that enables the use of such data in advanced analytics-based products, applications, and solutions.
- Work with business teams to understand the problem statement and identify the different stakeholders.
- Work with different teams/groups within and outside the organization, as well as supplier partners, to ensure smooth end-to-end delivery of projects.
- Identify, design, develop, and test database architectures and large-scale processing solutions.
- Guide the project delivery team and contribute hands-on development work.
- Recommend and implement ways to improve data reliability, efficiency, and quality. Propose tools and languages to combine data sources and uncover the opportunities that data acquisition provides.
- Develop data set processes for data modeling, mining, and production, and leverage large volumes of data.
- Automate work to build pipelines that harness data from its source for use in advanced analytic techniques.
- Willing to undertake domestic or international travel as required by projects.
- A minimum of a bachelor’s degree in Computer Engineering/IT is required.
- A minimum of 3-5 years of practical work experience in an IT, engineering, or related technical field is required.
- Expert in SQL/Impala/Hive or similar query languages.
- Advanced knowledge of big data tools: Hadoop, Spark, Kafka, etc.
- Expert in writing UNIX shell scripts and in Oozie.
- Experience leading and delivering data engineering projects is required.
- Demonstrable project experience using cloud technologies such as Microsoft Azure Data Factory is required.
- Should be a subject matter expert in full-stack engineering, big data technologies, and DevOps.
Preferred Knowledge, Skills and Abilities:
- Knowledge and working experience as a techno-functional consultant on ERP systems such as SAP or JDE is preferred.
- Knowledge of Java/Python/Spark/Scala is preferred.
- Familiarity with version control systems such as Bitbucket/Git.
- Familiarity with Machine Learning techniques.
- Familiarity with Cognos/Informatica/Teradata.
- Familiarity with visualization tools such as Tableau, Alteryx, or Power BI is a plus.
- Well versed in Agile methodology, with experience delivering projects using Jira and Confluence.
United States-New Jersey-Bridgewater
Johnson & Johnson Services Inc. (6090)