Data persists between builds (pipelines) with the Docker executor

I have a problem where the Docker executor reuses data from the previous pipeline when I run a new one. I would expect that every time I launch a new build (e.g. triggered by a new commit), I get a fresh container with the data freshly cloned from the Git repo. That is not the case.

My project is a Buildroot configuration (Buildroot is a build system for building Linux images) that has the Buildroot repo as a submodule. The first time Buildroot runs, it downloads all the packages it needs and makes tarballs from them. The problem is specifically with these packages: Buildroot should have to download them every time it needs them, but they are already there from the previous run.
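If I understand the defaults correctly, jobs reuse the previous working directory because `GIT_STRATEGY` defaults to `fetch`, which keeps untracked files (like Buildroot's downloaded tarballs) around. A minimal sketch of forcing a clean checkout per job in `.gitlab-ci.yml` — both variables are standard GitLab CI variables, nothing specific to my project:

```yaml
# .gitlab-ci.yml
variables:
  # Remove the build directory and re-clone instead of fetching
  # into whatever is left over from the previous job/pipeline.
  GIT_STRATEGY: clone
  # Also check out the Buildroot submodule fresh on every job.
  GIT_SUBMODULE_STRATEGY: recursive
```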
I didn't set up any cache, and even if I had, I suppose it would only work between jobs in a single pipeline, not between separate pipelines.
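From what I've read, the Docker executor also keeps the build directory in per-project Docker volumes so that the `fetch` strategy can work across jobs, which would explain data surviving between pipelines even with no `cache:` configured. A sketch of the runner-side setting that is supposed to turn those volumes off — this goes in the runner's `config.toml`, not in the project, and the name/url/token/image values below are placeholders:

```toml
# /etc/gitlab-runner/config.toml (on the runner host)
[[runners]]
  name = "my-docker-runner"             # placeholder
  url = "https://gitlab.example.com/"   # placeholder
  token = "REDACTED"                    # placeholder
  executor = "docker"
  [runners.docker]
    image = "buildroot-build-image"     # placeholder
    # Don't create/reuse the automatic cache volumes that keep
    # the build directory around between jobs and pipelines.
    disable_cache = true
```

Volumes left over from earlier builds would presumably still need to be removed by hand on the runner host (e.g. with `docker volume prune`).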
Is it a bug, or am I missing something?
I use GitLab 9.1.4 and GitLab Runner 11.1.0 (linux/amd64).

Did you manage to fix this or find the underlying issue? Facing something similar…