Rebrandly has been recognized by G2 Crowd as one of the leading software companies globally for 2024. The company's branded link-management platform helps individuals, agencies, businesses, and developers keep their links safe, secure, and engaging, driving optimal performance. Rebrandly follows a product-led growth (PLG) model, in which the product plays a central role in user acquisition, retention, and revenue expansion. With a diverse, global team spanning the US, Ireland, Italy, India, Spain, the UK, Poland, and the Philippines, Rebrandly serves tens of thousands of customers in over 30 countries. Established in Italy in 2015, Rebrandly has operated as a remote-first company since its inception and is headquartered in the US, with offices in Rome and Dublin. The platform is used by millions of users and global brands worldwide, including Versace, Toyota, PayPal, and Zillow. For more information, please visit www.rebrandly.com.
Purpose
As a Data Engineer, you will play a key role in shaping how we collect, process, and use data to make informed decisions that improve user experience and enhance product performance. Your responsibilities will include designing, building, and maintaining the infrastructure that delivers data to our internal stakeholders. As the sole data engineer in our growing company, you'll have a unique opportunity to architect our data infrastructure from the ground up and shape our data strategy. The infrastructure you design will feed our global data warehouse and support company-wide tools with customer, financial, and operational data. Your work will directly influence company strategy and operations, with high visibility across all departments, and you will be integral to data-driven decision-making across the product, marketing, and growth teams. This position is fully remote and based in Eastern North America or Ireland.
About You
- You are a problem solver at heart, passionate about working with large datasets and transforming data into insights.
- You thrive in a collaborative environment where your technical expertise contributes to a shared mission.
- You have experience using modern data infrastructure to build reliable, scalable, and efficient data solutions.
- You have experience building data systems from scratch and working independently.
What You’ll Do
- Design, build, and maintain the infrastructure to collect, store, and transform data from different sources.
- Collaborate with business stakeholders, data analysts, product teams, and other engineers to understand requirements and provide efficient solutions.
- Develop and optimize data pipelines to support business analytics, machine learning models, and product features.
- Ensure data quality, governance, and integrity in all aspects of our data infrastructure.
- Identify opportunities to improve existing data workflows and propose creative solutions.
- Establish pipelines for efficient data movement and transformation.
- Design flexible internal and external APIs to make our data accessible.
- Assist internal teams in maintaining and creating user-tracking events.
- Provide guidance to other engineers on best practices and implementation methods.
- Lead the design and roadmap for our data platform.
- Partner with operations, product, and engineering to advocate best practices and build supporting systems and infrastructure for various data needs.
- Own the ingest and egress frameworks for data pipelines that stitch together various data sources to produce valuable data products that drive the business.
We’d Love To Hear From You If You Have:
- 3+ years of experience designing data solutions, modeling data, and developing ETL/ELT pipelines at large scale.
- 3+ years of experience with a programming language (e.g., Python, Java, Scala, Go).
- Experience developing large-scale data solutions using cloud platforms (e.g., AWS, Azure, Google Cloud) and big data technologies (e.g., Apache Spark).
- Experience in data cataloging, classification, data quality, metadata management, and data lifecycle.
- Experience with SQL and data visualization/analytics tools (e.g., Tableau, Looker Studio).
- Experience with CI/CD pipelines and source control.
- Experience with Infrastructure as Code tools like Terraform.
- Proficiency in API design and development of RESTful web services or GraphQL.
- Working knowledge of containerization technologies like Kubernetes and Docker.
- An understanding of the importance of engineering with security best practices in mind.
- Experience working with product analytics tools and tracking user behavior.
- Experience with growth experimentation and A/B testing frameworks.
- Understanding of SaaS metrics (CAC, LTV, churn, expansion revenue, etc.).
- Empathetic: eager to see the world from a user's perspective.
- Self-starter: energized by business impact and capable of driving the direction of ambiguous projects.
- The drive to take ownership of complex data transformation pipelines and data models, and to debug them.
- Strong problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.
- Excellent organizational and time management skills.
- Adaptability to learn and work with new technologies.
- Passion for maintaining high data quality standards.
While It’s Not Required, It’s An Added Plus If You Also Have:
- Experience with machine learning techniques and tools.
We encourage curious and passionate individuals to apply. Rebrandly is an equal opportunity employer. All applicants will be considered for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, or disability status.