Data Delivery Engineer
Category: Data Science
Location: Bucharest, Romania
Referral Reward: € 1,000 - € 2,000
Employment Type: Employment contract
Time Allocation: Full Time
Workplace Type: Hybrid
Remaining positions: 3
What's in it for you:
- Pioneering work at the forefront of Big Data, Infrastructure, and Machine Learning, where academic papers are sometimes our only formal guide.
- The chance to shape an original product in a fast-growing, VC-backed company.
- A merit-based, no-BS, value-oriented, and highly collaborative work environment.
- Fair compensation, the possibility of ESOP and upward salary growth based on performance.
- Groundbreaking professional challenges and the autonomy to try things for the first time, no matter how experienced you are.
- The promise of the right tool for every job, so you can channel your efforts into what tools can't replace: the best use of your skills and abilities.
- The space to try new things (your way), paired with the freedom to make reasoned decisions and own them.
What we expect from you in return:
- High speed and uncompromising quality in your work
- A growth mindset, able to capitalize on unprecedented contexts through your skills and abilities
- An appetite to grapple with a variety of technical challenges
- The ability to quickly and effectively evaluate technical tradeoffs and translate them into relevant scenarios
- Consistent performance over time, backed by a matching set of values
- Strong problem-solving skills that help you manage the tension between the brief and shipping
- Aversion to the idea of any customer or colleague struggling with what you delivered
- High tolerance for ambiguity, marked by your ambition to push forward with incomplete information
- Resilience in the face of the failures that always accompany pioneering work
Your qualifications:
- Previous experience as a Data Engineer, Data Analyst, Data Scientist or a similar role.
- Experience with data warehousing, data modeling, and ETL/data pipeline development.
- Proficiency in working with SQL and statistical analysis tools.
- Experience using statistical computer languages (e.g. Python, Scala, R) to manipulate and analyze data.
- Experience with big data technologies (Spark, Cassandra) is a plus.
- Familiarity with statistical techniques and their applications.
- Familiarity with data architecture principles and best practices.
- Any experience with predictive modeling and data optimization techniques is a plus.
- Attention to detail and the ability to maintain focus while analyzing large datasets.
- Experience writing efficient and testable code.
- Ability to quickly prototype and implement practical solutions.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Advanced English.
Recruitment process:
- An assignment
- Two discussions, one of them face-to-face
Responsibilities:
- Work closely with the Customer Success and Data Engineering teams to address clients' data needs and enhance data delivery processes.
- Design, develop, maintain, and optimize ETL processes and data pipelines to ensure timely and accurate data delivery.
- Continuously monitor and improve the performance and accuracy of data pipelines.
- Work with data sources, transformation tools, and scheduling mechanisms to orchestrate data flows efficiently.
- Implement data validation checks and quality assurance processes to ensure the accuracy, consistency, and reliability of delivered data.
- Use statistical analysis (spreadsheets, SQL, etc.) to identify and resolve data anomalies, maintaining data quality standards.
- Develop custom data models and algorithms to optimize data extraction and processing.
- Generate and deliver accurate datasets to clients and partners.
- Prototype quickly to solve complex use cases, prioritizing practical solutions over theoretical ones.
- Prepare and verify data samples and usage reports for collaboration with Sales and Growth teams.
- Actively engage with team members and stakeholders to understand data delivery requirements and objectives.
- Identify areas of opportunity and improvement throughout data delivery processes.