Job Description: Data Engineer
Key Responsibilities:
Design, develop, and maintain ETL/ELT pipelines for data ingestion and processing
Build scalable data architectures (data lakes, data warehouses, batch + streaming pipelines)
Work with cross-functional teams to understand data requirements and deliver solutions
Optimize data flows and improve the reliability, performance, and quality of data
Develop and maintain data models, schemas, and metadata
Implement data validation, cleansing, and quality checks (see the validation sketch after this list)
Collaborate with analysts, BI teams, and ML engineers to support data needs
Ensure data security, compliance, and governance practices are followed
Monitor, troubleshoot, and enhance data pipeline performance
Document systems, processes, and workflows for future reference
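To make the validation and quality-check responsibility concrete, here is a minimal sketch in Python using pandas. The table, column names, and rules (order_id present, non-negative amount, parseable order_date) are illustrative assumptions, not a prescribed schema.

```python
# Minimal, illustrative data-quality check on an ingested batch (pandas).
# The column names and rules are hypothetical examples, not a required schema.
import pandas as pd

def validate_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Report rows failing basic quality rules and return only the clean rows."""
    checks = {
        "missing_order_id": df["order_id"].isna(),
        "negative_amount": df["amount"] < 0,
        "bad_order_date": pd.to_datetime(df["order_date"], errors="coerce").isna(),
    }
    failed = pd.concat(checks, axis=1).any(axis=1)
    for name, mask in checks.items():
        print(f"{name}: {int(mask.sum())} row(s)")
    return df[~failed].copy()

if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 2, None],
        "amount": [10.0, -5.0, 20.0],
        "order_date": ["2024-01-01", "2024-01-02", "not-a-date"],
    })
    print(validate_orders(sample))
```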
Required Skills & Experience:
Strong experience in SQL and relational databases (PostgreSQL, MySQL, SQL Server, etc.)
Hands-on experience with Python or Scala
Expertise in ETL tools and orchestration (Airflow, dbt, Luigi, etc.); a minimal Airflow sketch appears after this list
Experience with cloud platforms (AWS / Azure / GCP)
Familiarity with Big Data technologies (Hadoop, Spark, Kafka, Hive)
Knowledge of data warehouse solutions (Snowflake, Redshift, BigQuery, Databricks)
Good understanding of data modeling (star and snowflake schemas); a small star-schema example appears after this list
Experience handling large-scale datasets
Strong problem-solving and debugging skills
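As an illustration of the orchestration item above, the sketch below defines a minimal daily extract-transform-load DAG. It assumes Airflow 2.4 or later; the dag_id and task bodies are placeholders, not an actual pipeline.

```python
# Minimal, illustrative Airflow DAG: a daily extract -> transform -> load flow.
# The dag_id and task bodies are placeholders; real tasks would read/write data.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the source system")

def transform():
    print("clean and reshape the extracted data")

def load():
    print("write the transformed data to the warehouse")

with DAG(
    dag_id="example_daily_etl",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
):
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # run the tasks in sequence
```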
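And as a small worked example of star-schema modeling, the following sketch creates one fact table and two dimension tables in an in-memory SQLite database and runs a typical fact-to-dimension join; every table and column name here is hypothetical.

```python
# Minimal, illustrative star schema: one fact table and two dimension tables,
# built in an in-memory SQLite database. All names are hypothetical examples.
import sqlite3

DDL = """
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20240101
    full_date TEXT NOT NULL,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    quantity     INTEGER,
    amount       REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)

# A typical star-schema query: join the fact table to its dimensions and aggregate.
query = """
SELECT d.year, c.region, SUM(f.amount) AS total_amount
FROM fact_sales f
JOIN dim_customer c ON c.customer_key = f.customer_key
JOIN dim_date d ON d.date_key = f.date_key
GROUP BY d.year, c.region;
"""
print(conn.execute(query).fetchall())  # empty list until the tables are loaded
```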