Data Architect | Adelaide
Application Close Date: 26-Feb-2018
About the role
The Data Architect will expand and optimise our clients' data and data pipeline architecture; build, optimise and maintain conceptual and logical database models; and optimise data flow and collection for cross-functional teams. Your responsibilities include:
- Architect database solutions to store and retrieve company information
- Analyse structural requirements for new software and applications
- Define the strategy and architecture for migrating data from legacy systems to new solutions, and for data pipelines covering data collection, data analytics and other data movement
- Assess and propose the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of sources using ETL, database replication, application integration, SQL and 'big data' technologies
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
- Work with data and analytics experts to strive for greater functionality in data systems.
- Design conceptual and logical data models and flowcharts
- Assess and propose methods to improve system performance by conducting tests, troubleshooting and integrating new elements
- Define security and backup procedures
- Coordinate with the Data Science department to identify future needs and requirements
You will have the ability to optimise data systems and prepare strategy & architecture to build them from the ground up.
Essential skills and experience
- Proven work experience as a Data Architect or similar role
- In-depth understanding of database structure principles
- Experience gathering and analysing system requirements
- Technical expertise in data acquisition, data transformation, data models, database design and development, data mining and segmentation techniques
- Strong knowledge of and experience with Data Management and ETL technologies (e.g. SAP BO Data Services, SAP PO, IBM Message Broker and IBM Message Queue)
- Strong knowledge of and experience with Big Data integration and streaming technologies (e.g. Kafka, Flume, Talend etc.)
- Strong knowledge of and experience with Big Data processing technologies (e.g. Spark with Scala / Java / Python / R)
- Exposure to migrating data into ERPs (e.g. SAP CRM, IBM Case Management) and databases (e.g. HDFS, HBase, HANA, DB2, Teradata, MS SQL)
- Familiarity with data visualisation tools (e.g. Zoomdata, QlikView, Tableau, D3.js and R)
- Experience with cloud services
- Proven analytical skills, problem-solving attitude and working experience as a data engineer
- BS in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
What can we offer you?
Capgemini is a world leader in technology-enabled change. We offer our consultants:
- Formal training with industry-recognised certifications
- The ability to interact with peers in a sharing and inclusive community
- Exciting and challenging projects
- A well-structured and tailored career framework
- A culture of collaboration and recognition
- Excellent remuneration