Senior Data Engineer
SCORIFY is a fast-growing company delivering cutting-edge technologies based on big data analytics, artificial intelligence, and machine learning. Our solutions are used by financial institutions, peer-to-peer lending platforms, telcos, and central banks. We are looking for a strong addition to the team who will join us in creating new AI models and services.
Your responsibilities:
- Data Pipeline Development:
- Develop, test and maintain scalable data pipelines and workflows.
- Implement and manage ETL/ELT processes to transform, aggregate and move data from various sources to data warehouses.
- Ensure data integrity and quality throughout the data pipeline.
- CI/CD Pipelines:
- Develop and maintain CI/CD pipelines for data workflows.
- Automate data integration and deployment processes between environments.
- Big Data Management:
- Implement real-time data streaming solutions where necessary.
- Optimize data processing for speed and efficiency.
- Monitoring and Troubleshooting:
- Continuously monitor data systems for performance and reliability.
- Troubleshoot and resolve data-related issues.
- Implement monitoring and alerting systems.
- Refactor Existing Data Pipelines:
- Assess and analyze existing data pipelines to identify inefficiencies, bottlenecks, and areas for improvement.
- Update and modernize data workflows to leverage the latest technologies and best practices.
- Collaboration:
- Work closely with data scientists, analysts, and business stakeholders.
- Participate in project planning and management activities to ensure data engineering tasks are aligned with project timelines and goals.
- Provide technical support and guidance to team members, helping them resolve data-related issues and challenges.
What we expect from you:
- Technologies:
- PostgreSQL, MSSQL
- Data Warehousing
- Apache Airflow or similar
- Python
- Git
- Docker
- Jira
- At least 3 years of experience in Data Engineering.
- Proven expertise in building end-to-end data pipelines.
- Proficiency in Python and SQL for data processing, scripting, automation and database management tasks.
- Experience in designing and implementing ETL/ELT workflows.
- Good self-management and ability to meet agreed sprint goals based on prioritization discussed with the delivery lead.
What you can expect from us:
- Opportunity to showcase your expertise and fully own data-related processes.
- Competitive salary and benefits package.
- Opportunity to work with a talented and passionate team.
- Professional development and growth opportunities.
- A collaborative and innovative work environment.
- The salary range for this position is €4500–€6000 gross per month.