
Playing the Pricing Game

I recently worked on a project for Coolblue, one of the largest Dutch electronics retailers. In a team of four, we investigated how competitors were responding to each other on price, and whether there was a more deliberate strategy for Coolblue to play in the cellphone and vacuum cleaner markets.

In fast-moving categories like the ones we were given, staying still can feel like conceding ground. But we found that reacting to everything carries hidden costs of its own: operational churn, noisy decision-making, and a pricing posture that, from a customer's perspective, can look erratic.

Early on, we made it a priority to reframe the project from a dashboard problem into a decision-system problem. The goal was to build a disciplined framework for distinguishing what matters from what doesn't, and to identify the response patterns that were actually compounding into an advantage over time.

Our Approach


After some early bumps around process and alignment, we started by clearly defining the boundaries of the problem and the framework we intended to test, then worked through a deliberate sequence: how we explored, what we found, and what those findings implied for day-to-day commercial decisions.

Coolblue's feedback aligned well with this approach. They valued the logical coherence, the clarity of the framing, and the fact that we translated the analysis into recommendations that felt immediately deployable.

How We Got There


Before any modeling, we did extensive research spanning data science fundamentals, such as how to treat events over time, alongside game theory and economics literature on competitive responses, pricing incentives, and market behavior. Balancing that academic grounding with practitioner sources mattered too, because in a commercial environment, relevance is just as important as rigor.

The most effortful part, and also the most underestimated, was defining and cleaning the data carefully enough to trust it. Pricing datasets look well-structured on the surface, but with millions of observations and several columns, separating meaningful competitive moves from promotional effects, stock-related fluctuations, and tactical noise gets complicated quickly. Investing heavily here made everything downstream more defensible.
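To make that filtering idea concrete, here's a minimal pandas sketch of the kind of rule involved. The column names, the 3% threshold, and the persistence window are illustrative assumptions on my part, not the rules we actually used.

```python
import pandas as pd

def flag_meaningful_moves(df, min_pct_change=0.03, min_persist_days=2):
    """Flag price changes that are both large and persistent enough to
    plausibly be deliberate competitive moves, rather than one-day
    promotional blips or stock-driven noise. Assumes one observation
    per product per day."""
    df = df.sort_values(["product_id", "date"]).copy()
    grp = df.groupby("product_id")["price"]
    # Size of the move relative to the previous observed price.
    df["pct_change"] = (df["price"] / grp.shift(1) - 1).abs()
    # A move "persists" if the same price is still in effect
    # min_persist_days observations later.
    df["persists"] = grp.shift(-min_persist_days) == df["price"]
    df["meaningful"] = (df["pct_change"] >= min_pct_change) & df["persists"]
    return df

# One product: an 11% one-day dip (promo-like) vs. an 8% cut that holds.
prices = pd.DataFrame({
    "product_id": ["A"] * 9,
    "date": pd.date_range("2024-01-01", periods=9),
    "price": [100, 100, 89, 100, 100, 92, 92, 92, 92],
})
flags = flag_meaningful_moves(prices)
# Only the held cut to 92 is flagged; the one-day dip to 89 reads as noise.
```

The real version of anything like this has to contend with stock-outs, assortment gaps, and irregular observation times, which is exactly why the cleaning step swallowed so much effort.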

From that foundation, we looked at how response behavior varied by context, what timing was actually signaling, and which price moves were worth treating as meaningful.
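As an illustration of the timing angle, a toy sketch like the one below can measure how quickly a competitor follows one of our price changes. The series, the seven-day window, and the helper name are hypothetical; a tight, consistent lag is one simple signal that following is deliberate rather than coincidental.

```python
import pandas as pd

def follow_lags(ours, theirs, max_lag_days=7):
    """For each of our price changes, return the number of days until the
    competitor's next change, if it lands within max_lag_days."""
    our_moves = ours.index[ours.diff().fillna(0) != 0]
    their_moves = theirs.index[theirs.diff().fillna(0) != 0]
    lags = []
    for t in our_moves:
        later = [(u - t).days for u in their_moves
                 if 0 < (u - t).days <= max_lag_days]
        if later:
            lags.append(min(later))
    return lags

idx = pd.date_range("2024-01-01", periods=10)
ours = pd.Series([100] * 3 + [90] * 7, index=idx)    # we cut on day 4
theirs = pd.Series([99] * 5 + [91] * 5, index=idx)   # they cut two days later
# follow_lags(ours, theirs) -> [2]
```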

What the Client Valued


Going in, I'd been pretty focused on winning. After we presented, I was so satisfied with the outcome that winning almost felt secondary. I had doubts along the way, because our presentation leaned more commercial than technical, and I wasn't always sure that was the right call. But through work experience and my teammates, I'd come to trust that leading with strategic clarity over model mechanics was genuinely the more useful thing to give them.

We did win. And the feedback was consistent with what our gut had told us. They found the output usable. We packaged our insights into four recommendations meant to work together as a strategic playbook. One connected directly to a cost reduction for a line of analysts; another surfaced a dimension they weren't actively tracking. I can't go into specifics due to the NDA, but the larger lesson is this: the best analyses change what a team pays attention to (and what it can stop worrying about) in a simple and logical way.

What I'm Taking


  • Framing is the golden rule in applied data science. Without defining the decision and its boundaries early, a model can be technically sound yet answer a question nobody actually needed answered.


  • Speed is not the same as intelligence. For a company in fast markets, the instinct is to move quickly. The actual advantage lies in moving consistently and selectively, with a framework you can trust under pressure. The same applied to us as a team: we wanted to jump straight to the analysis, but the insights we almost skipped over in the middle were the reason we eventually reshaped the problem statement entirely.


  • Communication is a technical skill. Getting from analysis to action requires narrative clarity, and that gap is where a lot of commercial value gets lost. Constant overcommunication within the team about progress, blockers, and framing kept our final delivery coherent and consistent.

Looking Back


We won a Dutch oven, by the way, which I'm super thrilled about. As someone who doesn't eat meat, I have a lot of exploring to do lol. But beyond the prize, building something that felt like it could survive contact with the real world was quite rewarding. As I near the end of my student career (for now), there's something meaningful about doing the kind of project I've only ever done with clean, structured class data, but this time with messy data, unavoidable tradeoffs, and competing stakeholder expectations. Moving from thinking like a student to designing a decision process a business can actually sustain is a different kind of challenge, and I'm excited to carry that into whatever comes next.
