• Contract
  • London
  • £800 - £850 per day

IT Architect

£800-£850 per day

Remote Flexibility

You will be required to:
* Collaborate with stakeholders to devise a data strategy that addresses the organisation's data needs
* Build an inventory of the data needed to implement the architecture and perform initial exploratory data analysis
* Create a fluid, end-to-end vision for how data will flow through the organisation
* Develop data models for a data warehouse, data marts and operational data stores using different schema design methodologies
* Design analytical business views that present data in a simpler, more natural way to enable self-service analytics and reporting
* Contribute to end-to-end (E2E) solution architectures with a cost/benefit approach
* Implement measures to improve data accuracy and accessibility
* Develop Data Quality metrics for source and target data sets
* Document and publish Metadata and table designs to facilitate data adoption
* Design new tools with a focus on direct data ingestion from devices for IoT, faster feedback loops, real-time data processing, machine learning, segmentation, personalisation and decision support
* Identify and evaluate data ingestion and management technologies that can be leveraged to industrialise and automate data ingestion and processing
* Research new opportunities for data acquisition
* Ensure adherence to data protection measures, including anonymisation, access controls, PCI DSS compliance and EU GDPR compliance
* Attend Technical Design Forums to present/approve solution and data architectures
* Constantly monitor, refine and report on the performance of data management systems

You will need to have:
* Hands-on, demonstrable experience of defining data architecture and strategic direction to deliver Data Platform capabilities to support BI/Analytics/Reporting, Personalisation or Advanced Analytics (AI/ML) use cases
* Hands-on, demonstrable experience with data warehousing, data mart design, and data lake technologies
* Hands-on, demonstrable experience with data cataloguing, data profiling and data quality tools
* Proven experience leveraging Big Data offerings on Google Cloud Platform (GCP) such as BigQuery, Dataflow, Dataproc, Dataprep, Pub/Sub and DLP, or their functional equivalents on Amazon Web Services or Microsoft Azure
* Appreciation of architectural principles for building systems that support very high concurrency, are highly available, and remain resilient in the face of dependent component failures
* Understanding of machine learning for garnering better insights from data
* Demonstrable experience of defining and delivering solutions using Lambda and Kappa architectures, with emphasis on Apache Kafka-based streaming solutions
* Ability to anticipate business needs and translate them into technical solutions
* Ability to conceptualise and design platforms capable of supporting multiple systems or applications
* Ability to apply modelling techniques as needed to data, logical and physical architectures, ideally using data modelling tools like ER Studio, Erwin, etc.
* Strong analysis and problem-solving skills
* Good consultancy and influencing skills, working within strong technical environments
* Good written and verbal communication skills and the ability to work in a global matrix organisation
* Adept at translating complex technical concepts into meaningful recommendations
* Excellent planning and organisational skills
