Principal Technology Architect

About JMAN:

JMAN Group is a growing technology-enabled management consultancy that empowers organizations to create value through data. Founded in 2010, we are a team of 100+ consultants based in London, UK, and 170+ engineers in Chennai, India. Having delivered multiple projects in the US, we are now opening a new office in New York to support and grow our US client base. We approach business problems with the mindset of a management consultancy and the capabilities of a tech company. We work across all sectors and have in-depth experience in private equity, pharmaceuticals, government departments, and high-street chains. Our team is as cutting edge as our work. We pride ourselves on being great to work with – no jargon or corporate-speak, flexible to change, and receptive to feedback. We invest heavily in the training and professional development of our team, so they can deliver high-quality work and shape our journey to becoming a globally recognised brand. The business has grown quickly in the last 3 years, with no signs of slowing down.

Why Work at JMAN?

Our vision is to ensure JMAN is the passport to our team’s future. We want our team to go on a fast-paced, high-growth journey with us – when our people want to do something else, the skills, training, exposure, and values that JMAN has instilled in them should open doors all over the world.

Current Benefits:
  • Competitive annual bonus
  • Market-leading private health insurance
  • Regular company socials
  • Annual company away days
  • Extensive training opportunities

Technical Specification:
  • 5+ years of experience as a Data Architect or in a similar role, with proven expertise in developing and implementing data strategies.
  • Strong understanding of and hands-on experience with modern data warehouse solutions such as Snowflake, Redshift, or Synapse; proficiency in cloud platforms such as AWS, Azure, or GCP is preferred.
  • Thorough grasp of data governance and security principles, along with experience building data pipelines with ETL/ELT tools such as AWS Glue, Azure Data Factory, Synapse, Matillion, or dbt.
  • Familiarity with AI/ML platforms and data preparation for ML initiatives is a plus.
  • Excellent communication, collaboration, and problem-solving skills are essential.
  • Proficiency in Python and/or Apache Spark, plus knowledge of data patterns, data modelling, and product migration.
  • Good-to-have skills include data visualization using Power BI, Tableau, or Looker, and familiarity with full-stack technologies.

Responsibilities:
  • Develop and implement a comprehensive data strategy aligned with business objectives.
  • Design and architect a modern data platform using cloud technologies (e.g., AWS, Azure, GCP).
  • Build and manage a scalable, secure data warehouse leveraging modern solutions, including implementing data ingestion pipelines for various sources.
  • Implement CI/CD for data pipelines and infrastructure using DevOps tools and methodologies.
  • Implement data governance and security best practices to ensure data quality and compliance, and manage metadata.
  • Collaborate with cross-functional teams to understand data needs, develop data models, and build analytics tools for reporting, analytics, and AI/ML initiatives.
  • Stay current with emerging data technologies, evaluate cloud data platforms, and recommend innovations.
  • Manage and lead a team of data engineers, providing guidance, monitoring end-to-end operational processes, and overseeing the development and maintenance of data pipelines to ensure quality, reliability, security, and scalability.
  • Assess existing architecture and data maturity, identify gaps, propose solutions, and implement dimensional modelling, business domain conversion, and Data Vault design patterns.

Required Skillset:
  • ETL/ELT: AWS Glue / Azure Data Factory / Synapse / Matillion / dbt (any one – mandatory)
  • Data Warehouse: Azure SQL Server / Redshift / BigQuery / Databricks / Snowflake (any one – mandatory)
  • Cloud Experience: AWS / Azure / GCP
  • Data pipelines: quality, reliability, security, and scalability
  • SQL, plus Apache Spark and/or Python
  • Data Visualization: Power BI / Tableau / Looker (any one – good to have)
  • Full-stack technologies (good to have)
  • AI/ML platforms
  • Scripting and DevOps tools (e.g., Git, CI/CD pipelines)
  • Data governance
  • Data modelling
  • Product migration

 

 
