Chennai
Principal Architect
Engineering & Technology Team
On site
Company Description
ABOUT JMAN:
- JMAN Group is a growing technology-enabled management consultancy that empowers organizations to create value through data.
- Founded in 2010, we are a team of 450+ consultants based in London, UK, and a team of 300+ engineers in Chennai, India. Having delivered multiple projects in the US, we are now opening a new office in New York to help us support and grow our US client base.
- We approach business problems with the mindset of a management consultancy and the capabilities of a tech company. We work across all sectors and have in-depth experience in private equity, pharmaceuticals, government departments and high-street chains.
- Our team is as cutting edge as our work. We pride ourselves on being great to work with – no jargon or corporate-speak, flexible to change and receptive to feedback.
- We have a huge focus on investing in the training and professional development of our team, to ensure they can deliver high quality work and shape our journey to becoming a globally recognized brand. The business has grown quickly in the last 3 years with no signs of slowing down.
Why work at JMAN?
- Our vision is to ensure JMAN Group is the passport to our team’s future. We want our team to go on a fast-paced, high-growth journey with us – when our people want to do something else, the skills, training, exposure, and values that JMAN has instilled in them should open doors all over the world.
- Current Benefits:
  - Competitive annual bonus
  - Market-leading private health insurance
  - Regular company socials
  - Annual company away days
  - Extensive training opportunities
Position
Technical specification:
- 12+ years of experience as a Principal/Data Architect or in a similar role, with proven expertise in developing and implementing Data & Analytics strategies.
- Strong understanding of and experience with modern data warehouse/lakehouse solutions such as Databricks, Snowflake, Redshift, and Synapse; proficiency in cloud platforms such as AWS and Azure preferred.
- Thorough grasp of data governance and security principles, along with experience in data pipelines and ETL/ELT tools, including Databricks, dbt, and Azure Synapse.
- Excellent communication, collaboration, and problem-solving skills are essential.
- Proficiency in Apache Spark and in SQL or Python; knowledge of data patterns, data modelling (e.g., Data Vault), and product migration.
- Good-to-have skills include data visualization using Power BI, Tableau, or Looker, and familiarity with full-stack technologies.
- Familiarity with AI/ML platforms and data preparation for ML initiatives is a plus.
Responsibilities:
- Develop and implement a comprehensive data strategy aligned with business objectives.
- Design and architect a modern data platform using cloud technologies (e.g., AWS, Azure, GCP).
- Lead the architecture of data landscapes that reflect data strategies, including data integration, transformation, governance, modelling, and BI.
- Build and manage a scalable, secure data warehouse / lakehouse leveraging modern solutions, including implementing data ingestion pipelines from disparate sources.
- Implement CI/CD pipelines for data pipelines and infrastructure using DevOps tools and methodologies.
- Design and implement the framework for data migration from legacy to modern cloud technologies.
- Implement data governance and security best practices to ensure data quality, compliance, and manage metadata.
- Collaborate with our global team to articulate the data & technology approach and empower them to understand the impact of technology on our solutions.
- Collaborate with cross-functional teams to understand data needs, develop data models, and build analytics tools for reporting, analytics, and AI/ML initiatives.
- Effectively communicate technology solutions to clients in clear, concise, non-technical language. Lead workshops at client locations to assess the existing data & analytics landscape and propose suitable solutions.
- Design frameworks and create reusable products that improve delivery efficiency.
- Stay updated on emerging data technologies, recommend innovations, and evaluate/recommend cloud data platforms.
- Manage and lead a team of data engineers, providing guidance, monitoring end-to-end operational processes, and overseeing the development and maintenance of data pipelines to ensure quality, reliability, security, and scalability.
- Diagnose existing architecture and data maturity, identifying gaps, proposing solutions, and implementing dimensional modelling and business domain conversion/Data Vault design patterns.
Requirements
Required Skillset:
- ETL or ELT: Databricks / Azure Synapse / dbt / Fivetran (any one - mandatory)
- Data Warehouse / Lakehouse: Databricks / Snowflake / Fabric / Synapse (any one - mandatory)
- Cloud Experience: AWS / Azure / GCP
- SQL and Apache Spark / Python programming languages
- Data Visualization: Power BI / Tableau / Looker (any one - good to have)
- Version control & release management: GitHub / Azure DevOps / CI/CD pipelines
- Data Governance: Microsoft Purview / Atlan
- Full stack technologies (Good to have)
- Data Modelling: Data Vault, Dimensional modelling
- AI/ML platforms