Experience and Responsibilities:
- 5+ years of experience in data engineering, with a focus on designing and managing scalable data architectures.
- Proven experience in building data models and pipelines for IoT or event-driven platforms.
- Strong proficiency in Python or Scala for data processing.
- Expertise in data modeling (OLTP, OLAP, star/snowflake schemas) and database optimization.
- Experience with big data tools and technologies (e.g., Spark, Kafka, AWS Kinesis, Apache Flink).
- Proficient in working with time-series databases and IoT data (e.g., InfluxDB, TimescaleDB, AWS Timestream).
- Strong knowledge of cloud platforms (preferably AWS) and infrastructure as code tools (Terraform, CloudFormation).
- Experience with ETL/ELT processes and orchestration tools (e.g., Airflow, Prefect).
- Understanding of data governance, security, and privacy standards.
- Comfortable working in Agile teams, using JIRA, Git, and CI/CD pipelines.
- Bachelor’s degree in Computer Science, Engineering, or a related field preferred.
Skills and Personal Characteristics:
- Experience with machine learning pipelines or real-time analytics is a plus.
- Familiarity with data lakes and lakehouse architectures.
- Ability to collaborate with backend engineers and product teams.
- Excellent problem-solving skills and attention to detail.
- Passion for building systems that power smart, connected devices.