Principal Data Engineer at Brightcove

Location: Remote, US
Benefits: Career Development

Job Description:

We are seeking an experienced and highly skilled Principal Data Engineer to join our dynamic team. In this role, you will play a pivotal part in our data modernization efforts and will be responsible for designing, developing, and maintaining scalable data infrastructure and pipelines that support our organization’s data needs. You will leverage your expertise in Java, Python, Snowflake, GCP, AWS, APIs, batch processing, DBT, Kubernetes, CI/CD tools, monitoring/alerting, and data governance to ensure robust and efficient data solutions. Additionally, you will play a crucial role in coaching and mentoring junior engineers to foster their growth and development.

Key Responsibilities:

Architect and Design Data Systems: Lead the design and implementation of scalable data architectures and pipelines using Snowflake, GCP, AWS, and other technologies. Ensure data systems are efficient, reliable, and meet organizational needs. Develop and optimize data warehouse design and architecture to enhance performance and scalability.
Develop and Maintain Data Pipelines: Build and optimize data pipelines and ETL processes using Python, Java, and DBT. Handle batch processing and integrate with APIs as needed to facilitate data flow.
Data Infrastructure Management: Oversee the management and optimization of data infrastructure components, including cloud platforms (GCP, AWS) and container orchestration tools (Kubernetes).
CI/CD Integration: Implement and manage continuous integration and continuous deployment (CI/CD) processes for data engineering workflows using relevant tools and technologies.
Monitoring and Alerting: Set up and manage monitoring and alerting systems to ensure data pipelines and infrastructure are operating smoothly. Troubleshoot and resolve issues as they arise.
Data Governance: Establish and enforce data governance practices to ensure data quality, security, and compliance. Develop policies and procedures for data stewardship, data privacy, and data lifecycle management.
API Development: Build and integrate APIs to facilitate data exchange and ensure seamless connectivity between different systems and platforms.
Coaching and Collaboration: Provide guidance and mentorship to junior data engineers and team members. Foster a collaborative environment that encourages learning and professional development. Work closely with data scientists, analysts, and other stakeholders to understand their data requirements and deliver solutions that meet their needs.
Documentation: Maintain comprehensive documentation for data pipelines, architecture designs, and processes. Ensure documentation is up-to-date and accessible to team members.
Continuous Learning: Keep up with industry trends, emerging technologies, and best practices to ensure the data engineering team remains at the forefront of technology.
Experience and Qualifications:

12+ years of experience in data engineering or a related field.
Proven track record of designing and implementing large-scale data systems and pipelines.
Extensive experience with Snowflake, GCP, AWS, Kafka, and Kubernetes.
Strong proficiency in Java and Python.
Hands-on experience with batch processing and data transformation using Airflow and DBT.
Proven experience in building and integrating APIs.
Expertise in data architecture and system design.
Proficiency in using CI/CD tools for data workflows.
Strong understanding of data governance practices and data quality management.
Advanced skills in data warehouse design, performance tuning, and optimization.
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
Ability to coach and mentor team members effectively.
Preferred Qualifications:

Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field. An advanced degree is a plus.
Certification in relevant technologies (e.g., AWS Certified Data Analytics, Google Professional Data Engineer).
Experience with advanced monitoring and alerting tools.
Familiarity with data governance frameworks and compliance standards (e.g., GDPR, CCPA).
About Brightcove

Brightcove is a diverse, global team of smart, passionate people who are revolutionizing the way organizations deliver video. We’re hyped up about storytelling, and about helping organizations reach their audiences in bold and innovative ways. When video is done right, it can have a powerful and lasting effect. Hearts open. Minds change.

Since 2004, Brightcove has been supporting customers that are some of the largest media companies, enterprises, events, and non-profit organizations in the world. There are over 600 Brightcovers globally, each of us representing our unique talents, and together we have built a culture that values authenticity, individual empowerment, excellence, and collaboration. This culture enables us to harness the incredible power of video and create an environment where you will want to grow, stay, and thrive. Bottom line: We take our video seriously, and we take great pride in doing it as #oneteam.

 
