About Slyce

At Slyce, we’re reinventing how food delivery operations work. Our globally unique SaaS platform is already trusted by some of the world’s largest restaurant brands, such as Domino’s. By combining cutting-edge technology with deep food industry expertise, we help restaurants expand their reach, boost revenue, and build lasting customer loyalty.

Our founders previously built and scaled Honest Food, the world’s largest virtual restaurant brand, which was successfully acquired by Delivery Hero. We are building the next big thing in food tech. If you want to make a real impact, this is your chance.

Role Overview

We’re looking for a Senior Data Analyst who empowers our C-level to make data-driven decisions every day. You will report to the Chief Data Officer and be part of the data team. Day to day, you will collaborate closely with the CEO and COO and dive deep into how our algorithms perform for our clients. You will help the business prepare case studies to win trials and help us understand what works and what does not. Additionally, you will work closely with Data Engineering and Data Science to improve the quality of our decision-making.

This is a unique opportunity to work closely with the founding team in a fast-growing startup.

What You'll Do

- Own performance analytics: Analyze how our algorithms impact restaurant performance (orders, revenue, margin, marketing spend) and define the KPIs that matter.
- Partner with C-level: Work directly with the CEO, COO, and CDO to answer strategic questions, prioritize opportunities, and support key decisions with data.
- Build case studies & narratives: Create clear, compelling case studies and ROI analyses to support sales, customer success, and marketing.
- Design metrics & dashboards: Define core metrics and build dashboards that give the business a real-time view of performance across markets, products, and cohorts.
- Collaborate with Data Science & Engineering: Work with DS/DE to evaluate model performance, define experiments, and translate business needs into data and modeling requirements.
- Champion data quality & self-service: Help improve our data models, documentation, and tooling so that stakeholders can reliably self-serve for routine questions.

What We’re Looking For

- Experience: 3+ years of experience in data analytics, product analytics, or analytics engineering.
- SQL & warehousing: Strong proficiency in SQL and experience working with large datasets in a cloud data warehouse (BigQuery or similar).
- Analytics & statistics: Solid understanding of analytical methods and experimentation (A/B testing, cohort analysis, basic statistical inference).
- Python for analysis: Strong proficiency in Python for data processing and automation (e.g., pandas, Jupyter/Colab, simple scripts).
- Data modeling & transformations: Experience working with well-structured data models; comfortable reading and contributing to dbt-based transformations.
- Workflow literacy: Hands-on experience with modern data workflows and orchestration (e.g., Airflow or Cloud Composer), at least as a power user.
- BI & visualization: Experience building reports and dashboards in a BI tool (e.g., Looker, Tableau, Metabase, Mode, Power BI).
- Stakeholder communication: Strong communication skills, especially in explaining complex analyses to non-technical stakeholders and C-level.
- Solution-oriented mindset: Ability to prioritize and find pragmatic solutions within the time available.
- Collaboration: Ability to work both independently and in cross-functional teams, proactively driving topics to completion.
- Work authorization: EU work authorization required.

Nice-to-Have Skills

- Domain knowledge: Experience with marketplaces, food delivery, or digital marketing/advertising.
- ML product analytics: Familiarity with evaluating and monitoring ML-driven products (recommendation, bidding, optimization).
- GCP experience: Previous experience with GCP (BigQuery, Cloud Composer, Pub/Sub, Vertex AI) or a GCP Professional Data Engineer certification.
- Data quality: Understanding of data quality frameworks (e.g., Great Expectations, Soda) and how to implement basic checks.
- Streaming & real-time: Experience with streaming or event-based data (Pub/Sub, Kafka, Dataflow, or similar).
- APIs & integrations: Experience working with RESTful APIs and third-party platform integrations.
- Infrastructure as Code: Knowledge of tools like Terraform or Pulumi is a plus, especially if you’ve worked closely with data infrastructure.

Our Tech Stack

- Data & ML: Cloud Composer (Airflow), dbt, Python, BigQuery, Cloud Storage, Pub/Sub, Vertex AI
- Other: TypeScript, Docker, Kubernetes, Pulumi, GCP

What We Offer

- Opportunity to shape a category-defining food-tech platform at an early stage
- Build scalable ML-powered infrastructure using modern data tools on GCP
- Direct collaboration with founders and senior leadership
- A culture of ownership, learning, and continuous growth
- Competitive compensation aligned with market standards
- Hybrid work culture with flexible working styles

Application Process

Ready to actively shape the future of food delivery? Send your CV, portfolio, and a few lines on why you’re a great fit for Slyce to till@slyce.io

Let’s create something extraordinary - together. 🚀