Job Description
PineBridge Investments is Hiring a Data Engineer
Responsibilities:
- Developing and deploying streamlined data and CI/CD pipelines (ETL) to integrate data from diverse sources into the Snowflake warehouse.
- Orchestrating data model adjustments to adhere to warehouse standards.
- Implementing backfill processes and other data management protocols within the warehouse.
- Devising and executing rigorous testing methodologies to ensure optimal quality of warehouse data.
- Providing comprehensive documentation, training, and advisory services to users of the data warehouse.
- Cultivating a robust data and BI mindset to effectively manage competing priorities.
- Contributing insights and suggestions for continuous enhancement of operational processes.
Skills:
- Extensive hands-on experience (4+ years) in crafting and deploying fully functional solutions on the Snowflake Data Warehouse platform.
- Proficiency with Snowflake utilities including SnowSQL, Snowpipe, Snowflake Data Sharing, Streams, Tasks, and stored procedures.
- Mastery of SQL (4+ years).
- Advanced proficiency (4+ years) in constructing frameworks with Python.
- Strong aptitude for building data pipelines using AWS services such as S3, Lambda, EC2, Step Functions, and CloudFormation.
- Proficiency in programming, scripting, and data science languages such as PowerShell, R, SQL, and JavaScript.
- Experience in constructing data models encompassing conceptual, logical, and physical schemas for both relational and dimensional databases.
- Comprehensive understanding of Data Warehouse and BI principles.
- Familiarity with Big Data principles for organizing both structured and unstructured data is highly advantageous.
- Bachelor’s degree or higher in a technology-related discipline (e.g., Engineering, Computer Science); a Master’s degree is a plus.
Must Have:
- Proficiency in AWS services such as S3, Lambda, Step Functions, CloudFormation, AWS Glue, and AWS EC2.
- Familiarity with Cloud Databases like Snowflake, AWS Aurora, and AWS RDS.
- Expertise in data integration/programming languages including Python, Java, C#, SQL, and stored procedures.
- Familiarity with ETL/ELT Tools like Fivetran, Matillion, and/or Informatica.
- Experience with orchestration tools such as Apache Airflow, AutoSys, and Tidal.
Good to Have:
- Additional proficiency in other AWS services.
- Experience in data management encompassing data governance and data operations.
- Familiarity with BI tools like Power BI, Tableau, Plotly (Python), and Amazon QuickSight.
- Knowledge of CI/CD practices.
- Exposure to the asset management industry.