· Design and build the infrastructure required for optimal extraction,
transformation, and loading (ETL) of data from a wide variety of data sources.
· Support data scientists in acquiring and preparing data for use in
analytics models.
· Assess data acquisition requests based on priority, available resources, and effort required.
· Develop solution architecture for data pipelines in line with the data
reference architecture.
· Build optimal data pipeline architecture to provide actionable insights
into customer acquisition, operational efficiency, and other key business
performance metrics.
· Profile sample data against defined business and data quality rules, then
acquire and ingest data from source systems into the target environment.
· Initiate improvements to the data models that feed business intelligence
tools, increasing data accessibility and fostering data-driven decision making
across the organization.
· Design and implement processes and systems to monitor data quality,
ensuring production data is always accurate and available to the organization's
stakeholders and the business processes that depend on it.
· Identify valuable data sources across a wide range of business functions and automate data collection processes.