Job Description

Posted on:

October 27, 2024

As a Data Engineer on our Professional Services team, you will work with customers to understand their needs and requirements while discussing the 'art of the possible.' You will design and implement solutions for data warehouses, data lakes, ETL jobs, and data pipelines using AWS services, and design and implement AI/ML solutions using AWS or IBM services such as Bedrock and WatsonX. Your key responsibilities will include:

  • Consulting with customers to:
    • Understand their data management strategy
    • Provide insights into optimizing their data strategy
    • Architect and design data management processes
    • Excite customers about how AI/ML services can enhance their business
  • Implementing and deploying machine learning models and enabling advanced analytics
  • Documenting data architectures and data flows
  • Driving innovation internally and for customers
  • Contributing to R&D projects which may turn into new service offerings

To be successful in this role, you will need to:

  • Live and breathe the “cloud-first” approach
  • Think analytically to solve complex business problems
  • Obsessively deliver amazing customer experiences
  • Track new developments in one or more cloud platforms
  • Build trusting relationships with all team members
  • Push boundaries and technical limits
  • Stay updated on industry trends and continue learning

We are hiring two engineers, and the qualifications for the Mid-level range include:

  • 5+ years of professional services experience with customer-facing responsibilities
  • 2+ years of professional AWS and/or WatsonX experience
  • At least one AWS or Google Certification
  • Proficiency in one or more programming languages: Python, R, Java, Scala
  • Experience with:
    • SQL and NoSQL databases like PostgreSQL, MySQL, MongoDB, Cassandra
    • Data modeling: conceptual, logical, and physical data models
    • ETL tools like Informatica, Talend, Pentaho
    • Big data frameworks like Hadoop, Spark, or Kafka
    • Machine learning frameworks: TensorFlow, PyTorch, Keras, scikit-learn
    • Business Intelligence tools: Power BI and/or QuickSight
    • Dimensional modeling, star schemas, and data warehouses
    • Data architecture patterns like lambda and kappa architecture
    • Designing scalable and flexible data pipelines
    • Working within standard agile methodologies

For the Sr-level range, additional qualifications include:

  • 5+ years of consulting/professional services experience with customer-facing responsibilities
  • 4+ years of professional AWS and/or WatsonX experience
  • At least one AWS Professional Level Certification
  • Proficiency in one or more programming languages: Python, R, Java, Scala
  • Experience with:
    • Designing and building data pipelines
    • Creating machine learning models or using LLMs
    • SQL and NoSQL databases
    • Data modeling, dimensional modeling, star schemas, and data warehouses
    • ETL tools like Glue, Informatica, Talend, Pentaho
    • Big data frameworks
    • Machine learning frameworks
    • Business Intelligence tools
    • Data architecture patterns
    • Designing scalable and flexible data pipelines
    • Working within standard agile methodologies

The salary range provided is a general guideline. When extending an offer, Innovative considers factors including, but not limited to, the responsibilities of the specific role, market conditions, geographic location, as well as the candidate’s professional experience, key skills, and education/training.
