Create and maintain optimal data pipeline architecture, and assemble large, complex data sets that meet business requirements.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Create data tools for the analytics and data science teams that assist them in building and optimizing our product into an innovative industry leader.
Experience building and optimizing 'big data' data pipelines, architectures and data sets.
Strong analytical skills for working with unstructured datasets.
Experience with big data tools: Hadoop, Spark, Kafka, etc.
Experience with relational SQL and NoSQL databases.
Experience with AWS cloud services.
Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
To find out more about Progressive Recruitment please visit our website.
Award Winner for:
Best Medium Recruitment Company of the Year by Recruitment International 2018
Training & Development Initiative of the Year by Recruitment International 2018