Status: Open
Type: Employment
Organisation: Internews
Deadline: Ongoing
Location: Remote (United Kingdom or United States)
Internews is seeking to hire a Data Engineering Specialist, Media Viability Accelerator (MVA), who will lead a key function for a cutting-edge online platform that aims to set a new standard for the industry: developing, testing, and deploying data components for the MVA.
Reporting to the Media Business Unit’s Head of Technology, the Data Engineering Specialist will oversee Azure Synapse, notebooks, pipelines, and related resources.
This role entails managing data pipelines, databases, and visualizations for the MVA, including Power BI embedded analytics. Additionally, the role involves implementing data scraping for enriched datasets and liaising with the Back-End Web Development Specialist.
Responsibilities
- Develop an understanding of the MVA’s subject matter, purpose, datasets, and core technologies to contribute meaningfully to the program’s overall objectives.
- Contribute to defining the scope, objectives, and activities of the MVA.
- Collaborate with the MVA Forward program team, which leads the global rollout of the platform, to ensure partner and stakeholder feedback is reflected in the MVA.
- Work closely with colleagues to improve application usability and feature integration.
- Data Engineering
Required qualifications
- A minimum of 10 years of prior relevant experience, including at least three years of professional experience as a Data Engineer, with a proven track record of successful projects.
- Microsoft Azure Data Engineer certification, indicating in-depth knowledge of Azure services and best practices.
- Familiarity with Azure cloud services including Azure Data Lake Storage (ADLS), Azure Key Vault, and Azure Batch.
- Advanced experience or certifications in Azure cloud services and solutions strongly preferred.
- Database and data warehousing expertise, including experience with Azure SQL Database and Azure Synapse Analytics for data storage and warehousing.
- Ability to write and maintain Python scripts for data scraping tasks, demonstrating versatility across programming languages and frameworks.
- Integration experience, including OAuth authentication and working with RESTful APIs for data retrieval, manipulation, and third-party service integration.
- Proficiency with continuous integration and continuous deployment (CI/CD) practices and tools, along with experience in using version control systems like Git, ensuring best practices in code management and deployment workflows.
- Excellent written and spoken English skills.
- Proven interpersonal skills, including excellent team-building and communication.
- Experience with agile software development in a team.
Preferred qualifications
- Experience working with performance metrics for data visualization.
- Ability to troubleshoot complex issues and develop effective solutions.
- Extensive experience with best practices in data security and compliance.
- Experience working with news media.
- Proficiency in any other spoken language.
How to apply
Applications will be reviewed on a rolling basis. To learn more and apply, please click here.