Cognizant
Sydney, Australia
(on-site)
Job Function
Financial Services
Senior Data Engineer
Description
What makes Cognizant a unique place to work? The combination of rapid growth and an international, innovative environment! This creates many opportunities for people like you - people with an entrepreneurial spirit who want to make a difference in this world. At Cognizant, together with colleagues from all around the world, you will collaborate on creating solutions for the world's leading companies, helping them become more flexible, more innovative, and more successful. This is your chance to be part of the success story.
Position Summary:
• We're looking for a Senior Data Engineer to design and build scalable data solutions on the Azure cloud. The role involves developing high-performance data pipelines using Azure Databricks, Synapse, ADLS, and Delta Lake, working with complex insurance data (including Guidewire DataHub), and enabling analytics and Power BI reporting.
• You will work closely with business stakeholders, analysts, and technology teams to deliver trusted, governed datasets, particularly within the insurance domain and on platforms such as Guidewire DataHub.
Mandatory Skills:
Core Technical Skills
• Strong proficiency in Azure Databricks, PySpark, Azure Synapse, ADLS Gen2, and Delta Lake.
• Hands-on experience with Guidewire DataHub, Policy/Claim/Billing data models, and insurance data structures.
• Expertise in ETL/ELT pipeline development, data transformation, and performance tuning.
• Solid experience with SQL, Spark optimization, partitioning, and orchestration frameworks.
• Experience working with Power BI for dashboard and report development.
Domain Knowledge
• Strong understanding of P&C insurance domain, including policy, claims, billing, underwriting, and operations.
• Familiarity with insurance KPIs, metrics, and analytics use cases.
Cloud, DevOps & Security
• Experience with CI/CD pipelines, version control (Git), and automated deployment processes.
• Good understanding of data governance, metadata management, and data security standards.
Soft Skills
• Excellent requirement gathering and stakeholder management skills.
• Strong analytical thinking, problem solving ability, and attention to detail.
• Ability to lead onsite data initiatives and coordinate with distributed teams.
Duties and Responsibilities:
Data Engineering & Architecture
• Design, develop, and maintain end-to-end ETL/ELT data pipelines using Azure Databricks (PySpark), Azure Synapse, and Azure Data Factory.
• Integrate Guidewire DataHub and other insurance data sources into Delta Lake / Lakehouse environments.
• Develop scalable and reusable data ingestion, transformation, and validation frameworks.
• Optimize data pipelines for performance, reliability, and cost efficiency.
Data Modeling & Analytics
• Build logical, conceptual, and physical data models aligned with insurance domain requirements.
• Prepare analytical datasets for reporting, dashboarding, and advanced analytics use cases.
• Develop interactive Power BI dashboards and visualizations to deliver actionable insights.
Data Governance & Quality
• Implement data quality checks, reconciliation logic, and validation rules.
• Maintain data lineage, metadata documentation, and governance best practices.
• Ensure adherence to security standards, including RBAC, encryption, and access controls.
Stakeholder Collaboration
• Work closely with business teams, product owners, and SMEs to gather requirements and translate them into technical solutions.
• Act as an onsite liaison, supporting offshore teams through clarifications, priority setting, and technical guidance.
• Support UAT, production deployment, and ongoing maintenance of data solutions.
Continuous Improvement
• Drive best practices in data engineering, DevOps, and cloud-native development.
• Evaluate new tools, frameworks, and architectural patterns for continuous enhancement.
• Provide technical mentoring and support to junior data engineers and analysts.
Qualifications & Certifications (Optional):
• Databricks Certified Associate Data Engineer (good to have)
• Bachelor of Engineering in Computer Science or an equivalent discipline
• Strong Azure data engineering experience
• Insurance domain knowledge preferred
• Passion for building reliable, analytics ready data platforms
Salary Range: $80,000 to $90,000
Date of Posting: 30-Mar-26
Next Steps: If you feel this opportunity suits you, or Cognizant is the type of organization you would like to join, we want to have a conversation with you! Please apply directly with us.
For a complete list of open opportunities with Cognizant, visit http://www.cognizant.com/careers. Cognizant is committed to providing Equal Employment Opportunities. Successful candidates will be required to undergo a background check.
Job ID: 83214183
Explore Location: Sydney, Australia
• Median Net Salary (per month): $4,062
• Cost of Living Index: 79/100
• Median Apartment Rent in City Center (1-3 bedroom): $2,512 - $4,665 (median: $3,589)
• Safety Index: 66/100
• Basic Utilities (electricity, heating, cooling, water, garbage; 915 sq ft apartment): $120 - $373 (median: $207)
• High-Speed Internet: $42 - $70 (median: $55)
• Gasoline (1 gallon): $5.11
• Taxi Ride (1 mile): $2.83
Data is collected and updated regularly using reputable sources, including corporate websites and governmental reporting institutions.