
Data Engineer
Cribl
We’re looking for a Data Engineer to help build and scale the systems that power analytics, data science, and operational decision-making across Cribl. As part of the Data Engineering team, you’ll work at the intersection of the modern data stack (Snowflake, SQL, dbt), software engineering (cloud applications, IaC, observability), and our emerging AI/agentic workflows.
This is a high-impact role: the pipelines and services you build will directly influence how teams across Cribl understand the business, make better decisions, and experiment with new AI-driven experiences. You will be solving meaningful problems, collaborating with smart teammates, and building reliable systems in a fast-moving environment. We pride ourselves on fostering a collaborative and innovative culture where team members enjoy working together—whether remotely or over a meal at a foodie hot spot. If you're someone who thrives in an entrepreneurial environment and is eager to contribute to a company poised for legendary success in the tech industry, we want to hear from you!
As An Active Member Of Our Team, You Will...
- Build, operate, and monitor Cribl’s core data tech stack—including data pipelines, data integrations, and our data warehouse—ensuring data is accurate, timely, and trusted
- Develop cloud-native services and infrastructure that power scalable, reliable data systems, with logging, alerting, and observability as first-class concerns
- Contribute to infrastructure-as-code (Terraform or similar), clean deployment patterns, and operational hygiene
- Support Cribl’s growing data science and agentic initiatives by preparing model-ready datasets, exposing features, and integrating AI/LLM workflows into production systems
- Work closely with Data Analysts and business stakeholders to clarify requirements, validate data outputs, and translate business logic into reliable data artifacts
- Partner closely with Site Reliability Engineers and IT Engineers on initiatives that align with business needs
- Communicate risks, tradeoffs, and timelines proactively to keep work predictable
- Contribute to secure, compliance-minded engineering practices in collaboration with IT/Security
- We are a remote-first company and work happens across many time zones; you may occasionally be required to perform duties outside your standard working hours
If You Got It - We Want It
- Strong SQL and Python fundamentals; experience with ELT patterns and data modeling
- Exposure to Snowflake or a similar cloud warehouse (Databricks, Redshift, DuckDB), and familiarity with dbt or equivalent frameworks
- Experience building cloud applications or backend services (APIs, ingestion services, event-driven workflows)
- Hands-on experience with AWS cloud infrastructure and infrastructure-as-code tools such as Terraform
- Familiarity with workflow orchestration (e.g., Prefect, Airflow) and production-grade engineering practices (logging, alerting, versioning, CI/CD)
- Clear, concise communication and the ability to collaborate across data, engineering, and business teams
