Data Architect (Tallinn, Estonia)

Permanent
Estonia
Posted 3 years ago

Our client is a technology solutions company passionate about its customer-tailored product development. From requirements definition and specification, software coding and development, to application support and maintenance; they understand and assist with the entire product lifecycle. They strive to deliver the highest possible business output on every task and project they undertake.

Our client is currently hiring a Data Architect. You will be challenged with mastering the Data Layer and its supporting Technology Layer within the Enterprise Architecture domains.

Responsibilities

  • Passionate about data flows, you will apply extensive knowledge of data modelling from the business level down to the physical data models while working with stakeholders and technical product owners.
  • You will be challenged with applying data integration, data quality, and master data management best practices when building integrations for internal and third-party purposes.
  • You will have an opportunity to decide on the technology layer for data processing and persistence.
  • You will lead the company's data lake programme and assist with data warehouse design.
  • When you identify a business need for new data processing or data management technologies, you are encouraged to run proof-of-concept work and spend time on R&D activities with the development teams (following SAFe, of course).
  • While in charge of solution design, you will also be responsible for seeing that design through to delivery; proactive collaboration is the expected way to get there.

Qualifications

  • Experience designing and implementing data lakes with cloud providers (AWS, GCP).
  • Experience in data modelling and data management for technologies like Apache Kafka, Elasticsearch, and CockroachDB.
  • Extensive experience with NoSQL databases, e.g., Cassandra, Google Data Store/MongoDB, Neo4j.
  • Deep experience with big data technologies, e.g., Apache NiFi, Spark, Hadoop, Hive, AWS S3/GCP data storage, and AWS Kinesis.
  • Expert level of understanding of big data, distributed computing, and at least one cloud provider.
  • Extensive knowledge of data modelling from the business level down to the physical data models.

Job Features

Job Category: Information Technology
Salary / Hourly Rate: Attractive
Job Type: Permanent
Start Date: Immediately
