Mandatory skills: T-SQL, PL/SQL, Reporting Platform, Data Visualization, Database Administration, Performance Tuning, Python
Job Description:
We are looking for a highly motivated and skilled Data Engineer with experience in SQL, Databricks, data platforms, and cloud-based data services. The role is also responsible for enabling business intelligence insights using Power BI, Tableau, and other dashboarding tools.
Key Responsibilities:
Data Pipeline Development: Design, build, and maintain scalable and reliable data pipelines to ingest, process, and transform data using SQL, Databricks, and other data technologies.
SQL & Database Management: Write optimized SQL queries to extract and manipulate data from relational and non-relational databases.
Cloud Data Platforms: Work with cloud-based platforms (such as AWS, Azure, or GCP) for data storage and ETL processes.
Python Development: Leverage Python for data extraction, transformation, and automation tasks. Develop Python scripts for data processing, analysis, and system integration.
BI Tools & Dashboards: Design and implement visual dashboards and reports using Power BI and Tableau.
Collaboration: Work closely with Data Scientists, Business Analysts, and IT teams to understand data requirements and provide timely data solutions.
Performance Tuning & Optimization: Optimize data workflows for performance and reliability.
Documentation: Maintain clear documentation of data processes, architecture, and workflows.
Required Skills & Qualifications:
SQL: Proficiency in writing complex queries, optimizing performance, and managing data in SQL Server or other relational databases.
Databricks: Experience with Databricks, including using it for ETL, data processing, and integration with other data tools.
Data Platforms: Familiarity with cloud data platforms like Azure Data Factory, Amazon Redshift, or Google BigQuery.
Python: Strong experience in Python for data manipulation, automation, and integration with other systems.
Business Intelligence Tools: Expertise with data visualization and reporting tools such as Power BI and Tableau.
Data Modeling & ETL: Proficient in data modeling, ETL processes, and data warehousing concepts.
Version Control: Experience with version control tools like Git.
Communication Skills: Ability to effectively communicate technical information to non-technical stakeholders.
Preferred Qualifications:
Experience with containerized applications (Docker, Kubernetes).
Familiarity with Machine Learning and AI models.
Knowledge of Data Governance and Data Security best practices.
Familiarity with Agile and Scrum methodologies.
Education:
Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent practical experience).