Noob trying to get Conan + GCC docker image built for my libs

I am just starting my CI journey with my C++ Conan code and want to get a routine in place that builds and pushes my Conan packages to the GitLab package registry. I have been doing this manually for quite some time, but I'm ready to start automating it. The problem: as soon as I try to build any Docker image and apt install anything, I get "Unable to locate package" errors left and right. Do I need to upgrade my account to make this work, or what am I missing?
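
For context, the manual flow I'm trying to automate looks roughly like this (Conan 1.x CLI; the remote name, project ID, package reference, and token below are placeholders, not my real values):

# register the GitLab project's Conan repository as a remote
conan remote add gitlab https://gitlab.example.com/api/v4/projects/<project-id>/packages/conan
# authenticate against that remote with a personal access token
conan user my-gitlab-user -r gitlab -p <personal-access-token>
# build the package from the recipe in the current directory
conan create . mycompany/stable
# upload the recipe and the built binaries to the GitLab registry
conan upload "mylib/1.0.0@mycompany/stable" -r gitlab --all --confirm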

Dockerfile

FROM ubuntu:20.04

RUN apt install python3 python3-pip -y
RUN apt install automake bison cmake flex g++ libboost-all-dev libevent-dev libssl-dev libtool make pkg-config -y

RUN pip3 install conan

RUN python3 --version
RUN gcc --version

.gitlab-ci.yml

stages:
  - build-image          # List of stages for jobs, and their order of execution
  - build
  - test
  - deploy

build-image:
  stage: build-image
  image: docker
  services: 
    - docker:dind
  script:
    - echo $CI_REGISTRY_PASSWORD | docker login -u $CI_REGISTRY_USER $CI_REGISTRY --password-stdin
    - docker build -t $CI_REGISTRY_IMAGE .
    - docker push $CI_REGISTRY_IMAGE


build-job:       # This job runs in the build stage, after build-image.
  stage: build
  image: conanio/gcc9
  script:
    - sudo apt install automake make cmake -y
    - pip3 install conan
    - echo "Compiling the code..."
    - echo "Compile complete."

unit-test-job:   # This job runs in the test stage.
  stage: test    # It only starts when the job in the build stage completes successfully.
  script:
    - echo "Running unit tests... This will take about 60 seconds."
    - sleep 60
    - echo "Code coverage is 90%"

lint-test-job:   # This job also runs in the test stage.
  stage: test    # It can run at the same time as unit-test-job (in parallel).
  script:
    - echo "Linting code... This will take about 10 seconds."
    - sleep 10
    - echo "No lint issues found."

deploy-job:      # This job runs in the deploy stage.
  stage: deploy  # It only runs when *both* jobs in the test stage complete successfully.
  script:
    - echo "Deploying application..."
    - echo "Application successfully deployed."

The failing job output

Step 1/6 : FROM ubuntu:20.04
20.04: Pulling from library/ubuntu
08c01a0ec47e: Pulling fs layer
08c01a0ec47e: Verifying Checksum
08c01a0ec47e: Download complete
08c01a0ec47e: Pull complete
Digest: sha256:669e010b58baf5beb2836b253c1fd5768333f0d1dbcb834f7c07a4dc93f474be
Status: Downloaded newer image for ubuntu:20.04
 ---> 54c9d81cbb44
Step 2/6 : RUN apt install python3 python3-pip -y
 ---> Running in 84f04e86dccb
WARNING: apt does not have a stable CLI interface. Use with caution in scripts.
Reading package lists...
Building dependency tree...
Reading state information...
E: Unable to locate package python3
E: Unable to locate package python3-pip
The command '/bin/sh -c apt install python3 python3-pip -y' returned a non-zero code: 100
Cleaning up project directory and file based variables 00:01
ERROR: Job failed: exit code 100

Do I need to upgrade my account, or what am I missing? I figured that apt install python3 on Ubuntu 20.04 would be about as common a workflow as it gets, no?

EDIT:

Seems the solution was simply to start the Dockerfile with

FROM ubuntu:20.04

ARG DEBIAN_FRONTEND=noninteractive

RUN apt update
RUN apt upgrade -y

Without this it simply won't find any packages, which I guess is by design: the base image ships with an empty apt package index, so nothing is locatable until apt update pulls one down. I normally avoid Docker containerization, so this was painful, but I am unblocked now and can build my custom Ubuntu container.
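
For anyone who finds this later, the full Dockerfile now looks roughly like this (same package set as before, collapsed into one apt-get layer; the list cleanup at the end is just standard image-size hygiene, not part of the fix):

FROM ubuntu:20.04

# stop tzdata and friends from prompting for input during the build
ARG DEBIAN_FRONTEND=noninteractive

# refresh the package index first -- the base image ships without one,
# which is why the plain apt install kept failing with "Unable to locate package"
RUN apt-get update && apt-get upgrade -y && \
    apt-get install -y \
        python3 python3-pip \
        automake bison cmake flex g++ libboost-all-dev libevent-dev \
        libssl-dev libtool make pkg-config && \
    rm -rf /var/lib/apt/lists/*

RUN pip3 install conan

# sanity checks
RUN python3 --version && gcc --version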