There are a couple of things that affect the runtimes.
In general, when you create a new robot, open a robot, or edit the conda.yaml, we need to set up that environment. Even a small Python environment is usually around 200 MB and typically consists of a lot of small files.
RCC builds a cache out of the environments, so only the first run actually loads the environment. RCC also detects if the live environment has been polluted. For example, if you manually go into the environment after it has been built and install something with pip, then on the next run RCC will restore the environment to the state that conda.yaml dictates. This is done to isolate the execution environment from any unknown changes.
In the log you shared we are getting a cache hit, so I think you might be hitting the Defender case below.
On Windows machines especially, disk speed takes a hit from Windows Defender (or other antivirus tools), as they basically have to scan each small file before letting it through. This can roughly cut the effective disk speed in half.
It is quite common to create Defender exclusion rules for developer folders, and we have a guide for this.
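As a quick sketch of what such an exclusion can look like (the path here is an assumption based on where RCC keeps its environment cache by default; adjust it to your setup, and run PowerShell as administrator):

```powershell
# Exclude the RCC environment cache folder from Defender real-time scanning.
# NOTE: the path below is an assumed default location; verify it on your machine.
Add-MpPreference -ExclusionPath "$env:LOCALAPPDATA\robocorp"
```

This only skips real-time scanning for that folder, so make sure you are comfortable with the security trade-off before adding the rule.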
A bit of a long answer, but this is one of the key benefits we want to provide with RCC, so we are a bit passionate about this.
You do not have to fight with setting up different Python versions, pip installs, etc., which break whenever you touch something or try to run on another machine; just define the conda.yaml and RCC does it for you.
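For context, a conda.yaml is just a small declarative spec. A minimal sketch might look like this (the exact package versions here are illustrative, not a recommendation):

```yaml
# conda.yaml: declares the full environment; RCC builds and caches it for you.
channels:
  - conda-forge
dependencies:
  - python=3.9.13     # pinned Python version, same on every machine
  - pip=22.1.2
  - pip:
      - rpaframework==22.5.3   # example pip dependency
```

Pinning versions like this is what lets RCC guarantee the same environment on every machine and detect when the live environment has drifted from the spec.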
Sneak peek: we also have the next evolution of the RCC environment cache in the pipeline, which will reduce the disk footprint by a lot.