Description
At Johnson & Johnson, we believe health is everything. Our strength in healthcare innovation empowers us to build a world where complex diseases are prevented, treated, and cured, where treatments are smarter and less invasive, and solutions are personal. Through our expertise in Innovative Medicine and MedTech, we are uniquely positioned to innovate across the full spectrum of healthcare solutions today to deliver the breakthroughs of tomorrow, and profoundly impact health for humanity. Learn more at https://www.jnj.com/.
Major Duties & Responsibilities
- Develop data science solutions: understand the business problem, process data, apply algorithms to discover trends and patterns, visualize data, generate key insights, and communicate them to senior stakeholders.
- Write production-level, scalable code in Python or R.
- Implement machine learning and statistical models for predictive modelling, time series forecasting, decision support, Gen AI, NLP, etc.
- Execute data science projects end to end: from data acquisition and pre-processing through modelling and visualization to full integration with end-user systems and processes.
- Develop data processing and modelling pipelines for large datasets from finance and ERP systems such as SAP HANA.
- Present results and key insights, and explain complex analyses in simple terms to senior leaders across different functions to help drive data-driven decision making and adoption. Leverage data visualization/dashboarding tools for interactive presentations.
- Deploy analytical models into production. Take accountability for requirements, procedures and standards such as version control and documentation.
- Automate data pipelines and orchestrate workflows.
- Leverage and advance a robust data architecture and cloud-based technology stack to support production-level work.
- Partner with cross-functional teams: Technology, Finance Systems and Data, Group Finance and CFOs, Compliance, Data Engineering, and other Data Scientists to ensure project success.
- Keep up to date with, and apply, the latest technology trends in data science and artificial intelligence, including Generative AI.
- Flexibility to work in a global team spanning multiple time zones, with a culture of continuous innovation, teamwork, collaboration, and inclusion among team members.
- Work in an agile manner on projects of varying length and complexity, with work prioritization and deadlines.
- Advocate and facilitate the use of data-driven insights and data science methods across J&J's Global Finance organization.
- Understand J&J's business as well as the Global Finance function to better apply data science solutions.
Qualifications
Basic Qualifications
- Required Minimum Experience: At least 3 to 4 years of relevant experience.
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Business Analytics, Science, Finance, or any quantitative or STEM discipline. Preferred: Master's or PhD in a relevant field.
- Ability to work and collaborate with a global team, innovate, and drive for common goals and achievements.
- Experience executing data science projects from initiation through proof of concept (PoC) to production
- Knowledge of big data, data pipelining, and machine learning, plus intermediate-level software development and automation skills
- Must have a strong commercial focus, a solid background in data science, and experience with machine learning/statistics. Data science experience in a finance or commercial setting is strongly preferred.
- Ability to connect and partner with various stakeholders (cross-functional) including senior executives to identify their needs and drive adoption of the solutions.
- Experience in managing compliance adherence, risk management and stakeholder commitments in large organizations.
Technical Requirements
- Strong experience in writing production-level code in Python/R programming languages
- Hands-on experience with cloud stacks such as AWS, Microsoft Azure, Domino, etc.
- Expertise in a variety of machine learning/AI techniques (clustering, decision trees, neural networks, deep learning, Gen AI, etc.)
- Expertise in advanced statistical techniques and concepts (regression, properties of distributions, statistical tests etc.)
- Expertise with Time Series data and Natural Language Processing (NLP)
- Experience in developing end-to-end production pipelines, taking a model from proof-of-concept to production-grade.
- Experience with Model Deployment and Monitoring, APIs
- Experience with advanced Excel, including Power Pivot, and PowerPoint
- Knowledge of automated workflow tools such as Alteryx, workflow orchestration tools such as Airflow, and container orchestration platforms such as Kubernetes
- Knowledge of Tableau/Power BI/Dash for data visualization and dashboarding
- Knowledge of machine learning frameworks (e.g., TensorFlow, scikit-learn).
- Knowledge of Generative AI and LLM platforms and models such as AWS Bedrock, OpenAI, and Meta Llama
- Standards, scientific rigor, and best practices around version control, technical documentation, and agile methodology, leveraging tools such as JIRA, Confluence, and Git
- Experience with SQL to acquire and manipulate data; knowledge of data warehouses, ERP systems such as SAP HANA, and data lakes such as AWS S3
- Continuous improvement mindset, critical thinking and innovation. Drive to learn and master new technologies.
- Able to work in an environment that requires work prioritization, deadline changes, risk management, compliance adherence, etc.
- Experience working across multiple levels of stakeholders, creating context around key business drivers and strategic plans