Data Engineer, Commerce

Remote
Full Time
Recurrent Ventures
Experienced

The Opportunity:

We are looking for someone to join the growing Commerce team at Recurrent. As Data Engineer for Commerce, you will transform data into formats that can be easily analyzed and deploy data products in a multiplatform ecosystem. You’ll develop, test, and maintain data pipelines that handle millions of dollars in transactions and hundreds of thousands of users every month. You will help scale computational environments, establish cloud infrastructure and design practices across the Commerce team, and inform, develop, deploy, and scale microservices and data science models.

Our team works across the entire stack that supports shopping content and other buying opportunities across our sites. We focus on execution, working quickly to meet our business’s needs while at the same time developing a framework for scalable, long-term success. We’re building on a mix of internally developed and external tools to find the best-in-class tech stack.

We are looking for an engineer who can assume responsibility for the architecture and management of our data platform. You will be the principal data engineer in charge of ETL (Extract, Transform, Load), validation, and optimization of complex, often incomplete data sets to meet business needs. You will not only analyze complex data elements, systems, dependencies, and relationships, but also develop the logical and physical data models, including the data warehouse.

You will work with stakeholders, including the Executive and Product leads, to support data infrastructure needs while assisting with data-related technical issues. A strong desire to learn and teach others is important.

This role will involve development of prototype applications and production code. It’s an excellent opportunity for a curious engineer who enjoys a mix of both theory and application in a fast-growing and exciting space within digital media.

 

The Responsibilities:

  • Build and optimize data sets, ‘big data’ data pipelines, and architectures.
  • Deploy analytics programs, machine learning and statistical methods.
  • Build processes that support data transformation, workload management, data structures, dependency management, and metadata.
  • Prepare data for predictive and prescriptive modeling.
  • Perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions that arise.
  • Responsible for understanding and complying with applicable quality, environmental and safety regulatory considerations. If accountable for the work of others, responsible for ensuring their understanding and compliance.

The Current Tech Stack:

Frontend Languages and Libraries

  • React.js, Next.js
  • PHP

Backend

  • WordPress
  • AWS
  • Redis, Fastly
  • S3, BigQuery
  • GraphQL
  • Python
  • Adverity

 

The Qualifications:

We encourage all candidates with some of these qualifications to apply. Everyone’s background is different and what matters to us is the willingness and excitement to work on affiliate technology.

  • Bachelor’s Degree or higher in Computer Science, Information Systems, MIS, Mathematics, Systems Engineering, Data Science, Economics, Physics, or a similar field.
  • At least 2 years of experience working in a product or platform development environment reflecting increasing levels of responsibility.
  • Experience with at least 2 of the following database technologies: MySQL, Elasticsearch, MongoDB, Redshift, Postgres, Neo4j, Oracle, HANA, Snowflake, MSSQL.
  • Experience building and optimizing ‘big data’ data pipelines and architectures.
  • Experience with data visualization tools such as Looker, Google Data Studio, etc.
  • Experience processing data streams.
  • 2 years of programming experience in Python.
  • Ability to write ANSI SQL queries.
  • Experience creating containerized workloads.
  • Capable of performing root cause analysis on internal and external processes and data to identify opportunities for improvement and to answer questions.
  • Competence with big data tools such as Hadoop, Spark, Kafka, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift, Kinesis, etc.
  • Knowledge of data modeling and schema design complete with proven data transformation capabilities.
  • Excellent analytical skills for working with datasets.

Bonus if you have:

  • Industry experience in digital advertising, digital media, or affiliate marketing.
  • Experience in M&A, i.e. merging new technologies or engineering teams through acquisitions and integrating disparate systems and workflows into a unified ecosystem.
  • AWS Big Data certification.
  • AWS Certified Solutions Architect certification.

 

The Benefits & Perks:

  • Medical, dental, vision & life insurance
  • Unlimited PTO
  • Remote - work from anywhere!
  • Offices in Miami, San Francisco, New York, San Diego
  • Parental leave
  • Matching 401k
  • Equity package