CARLA is a simulator for self-driving cars.

Getting Started

Installation

Follow the steps at QuickStart.

pip install --user pygame numpy
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 92635A407F7A020C
sudo add-apt-repository "deb [arch=amd64 trusted=yes] http://dist.carla.org/carla-0.9.9/ all main"
sudo apt-get update
sudo apt-get install carla-simulator
cd /opt/carla-simulator

Running a simulation

  • Start the server by running /opt/carla-simulator/bin/CarlaUE4.sh

Now you will need to write a client in Python. The client handles setting up the scene and all sensor data. See Tutorial: Retrieve Data and scroll down to tutorial_ego.py.
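
A minimal client sketch is shown below (an illustration only, not the tutorial script; it uses stock blueprint IDs shipped with CARLA and assumes the server is running on the default port):

import carla

# Connect to the server and grab the current world.
client = carla.Client('localhost', 2000)
client.set_timeout(10.0)
world = client.get_world()

# Spawn a vehicle on autopilot.
blueprints = world.get_blueprint_library()
vehicle_bp = blueprints.filter('vehicle.tesla.model3')[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(vehicle_bp, spawn_point)
vehicle.set_autopilot(True)

# Attach an RGB camera and save every frame to disk.
camera_bp = blueprints.find('sensor.camera.rgb')
camera = world.spawn_actor(camera_bp,
                           carla.Transform(carla.Location(x=1.5, z=2.4)),
                           attach_to=vehicle)
camera.listen(lambda image: image.save_to_disk('out/%06d.png' % image.frame))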

Notes
  • If you want a clean shutdown, disable autopilot on the ego_vehicle before you destroy it, as in the snippet below.
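
For example, continuing from the sketch above (camera and vehicle are the actors spawned there):

camera.stop()                 # stop the sensor callback first
camera.destroy()
vehicle.set_autopilot(False)  # disable autopilot before destroying the vehicle
vehicle.destroy()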

Build From Source

To develop vehicles or make significant changes, you will want to build from source.

See Linux build summary

  • Get Unreal Engine by cloning the UnrealEngine repo and building it.
    • You will need an Epic Games account linked to your GitHub to have access to the source code.
  • Clone the carla repo and download assets.

Sensors

Sensors Reference

RGB Image

Depth Image

Depth images encode the z-depth to the camera across the three 8-bit RGB channels.
To recover the depth in meters from the 8-bit RGB image, apply the following formula:

normalized = (R + G * 256 + B * 256 * 256) / (256 * 256 * 256 - 1)
in_meters = 1000 * normalized

I recommend resaving the depth images as single-channel 16-bit PNGs.
This keeps the depth values far less sensitive to rounding errors than an 8-bit image would (PNG is lossless either way, but 8 bits only give 256 depth levels).
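
For reference, here is a minimal numpy sketch of this decoding (an illustration; it assumes the frame is a carla.Image, whose raw_data buffer is 8-bit BGRA):

import numpy as np

def decode_depth(image):
  """Decode a CARLA depth frame to per-pixel depth in meters."""
  bgra = np.frombuffer(image.raw_data, dtype=np.uint8)
  bgra = bgra.reshape(image.height, image.width, 4).astype(np.float64)
  b, g, r = bgra[..., 0], bgra[..., 1], bgra[..., 2]
  normalized = (r + g * 256 + b * 256 * 256) / (256 ** 3 - 1)
  return 1000.0 * normalized  # the far plane is 1000 m

To write a 16-bit PNG, scale normalized to the uint16 range (e.g. np.uint16(normalized * 65535)) before saving.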

To convert z-depth values to Euclidean distance, divide by \(\cos(\theta)\), where \(\theta\) is the angle between the pixel's viewing ray and the optical axis (the ray through the center of the image).
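
For a standard pinhole camera this works out to the sketch below (an illustration, not CARLA API; it assumes you know the horizontal field of view used for the capture, which defaults to 90°):

import numpy as np

def zdepth_to_euclidean(depth, fov_deg=90.0):
  """Convert a pinhole-camera z-depth map (H x W, in meters) to Euclidean distance."""
  h, w = depth.shape
  f = w / (2.0 * np.tan(np.radians(fov_deg) / 2.0))    # focal length in pixels
  u = np.arange(w) - (w - 1) / 2.0                     # pixel offsets from the image center
  v = np.arange(h) - (h - 1) / 2.0
  uu, vv = np.meshgrid(u, v)
  cos_theta = f / np.sqrt(uu ** 2 + vv ** 2 + f ** 2)  # cosine of the angle to the optical axis
  return depth / cos_theta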

Z-depth to Euclidean Depth


Below is code to convert z-depth to Euclidean depth for an equirectangular image stitched from cubemaps.
Adapt it to your own setup.

import numpy as np
import my_helpers  # provides spherical_to_cubemap (defined below) and spherical_to_cartesian

def zdepth_to_euclidean_equirect(depth_image, height, width):
  """Converts z-depth of shape (batch, height, width, channels) to Euclidean depth."""
  # Spherical angles at the center of every equirectangular pixel.
  theta, phi = np.meshgrid((np.arange(width) + 0.5) * (2 * np.pi / width),
                           (np.arange(height) + 0.5) * (np.pi / height))
  uvs, uv_sides = my_helpers.spherical_to_cubemap(theta.reshape(-1),
                                                  phi.reshape(-1))
  cubemap_uvs = uvs.reshape(height, width, 2)
  # cubemap_uv_sides = uv_sides.reshape(height, width)
  # Ray direction of each pixel within its cubemap face.
  world_coords = my_helpers.spherical_to_cartesian(
      cubemap_uvs[:, :, 0] * np.pi / 2 + np.pi / 4,
      cubemap_uvs[:, :, 1] * np.pi / 2 + np.pi / 4)
  # Cosine of the angle between each ray and the face's forward (+z) axis.
  cos_angle = world_coords @ np.array([0, 0, 1], dtype=np.float32)
  return depth_image / cos_angle[np.newaxis, :, :, np.newaxis]
  • Here cubemap_uvs are the UVs of each equirectangular pixel within the cubemap.

Spherical to cubemap is defined as follows:

def spherical_to_cubemap(theta, phi):
  """Converts spherical coordinates to cubemap coordinates.

  Args:
    theta: Longitude (azimuthal angle) in radians. [0, 2pi]
    phi: Latitude (altitude angle) in radians. [0, pi]

  Returns:
    uv: UV coordinates within the face, in channels-last format, in [0, 1].
    side: Index of the cubemap face for each point (0 = top, 1-4 = side faces, 5 = bottom).

  """

  u = np.zeros(theta.shape, dtype=np.float32)
  v = np.zeros(theta.shape, dtype=np.float32)
  side = np.zeros(theta.shape, dtype=np.float32)
  side[:] = -1

  for i in range(0, 4):
    indices = np.logical_or(
        np.logical_and(theta >= i * np.pi / 2 - np.pi / 4, theta <=
                       (i + 1) * np.pi / 2 - np.pi / 4),
        np.logical_and(theta >= i * np.pi / 2 - np.pi / 4 + 2 * np.pi, theta <=
                       (i + 1) * np.pi / 2 - np.pi / 4 + 2 * np.pi))
    u[indices] = np.tan(theta[indices] - i * np.pi / 2)
    v[indices] = 1 / (
        np.tan(phi[indices]) * np.cos(theta[indices] - i * np.pi / 2))
    side[indices] = i + 1
  top_indices = np.logical_or(phi < np.pi / 4, v >= 1)
  u[top_indices] = -np.tan(phi[top_indices]) * np.sin(theta[top_indices] -
                                                      np.pi)
  v[top_indices] = np.tan(phi[top_indices]) * np.cos(theta[top_indices] - np.pi)
  side[top_indices] = 0
  bottom_indices = np.logical_or(phi >= 3 * np.pi / 4, v <= -1)
  u[bottom_indices] = -np.tan(phi[bottom_indices]) * np.sin(
      theta[bottom_indices])
  v[bottom_indices] = -np.tan(phi[bottom_indices]) * np.cos(
      theta[bottom_indices])
  side[bottom_indices] = 5

  assert not np.any(side < 0), "Side less than 0"

  return np.stack(((u + 1) / 2, (-v + 1) / 2), axis=-1), side

Fixed time step

Reference

Normal games run in real time: frames are generated as fast as possible (or limited by vsync), and the time between frames is whatever actually elapsed. For demos and games this is fine, but for simulations you will want to fix the time between frames to a constant value. This allows faster-than-realtime simulation on more powerful hardware and slower-than-realtime simulation on weaker hardware.

The following starts the server with a fixed time step of 1/30 seconds per frame:

./CarlaUE4.sh -benchmark -fps=30

Alternatively, you can set the fixed time step from the client by modifying the world settings (here 0.05 s, i.e. 20 FPS):

settings = world.get_settings()
settings.fixed_delta_seconds = 0.05
world.apply_settings(settings)

If your client needs to record data every frame, you may want to use synchronous mode (see below).

Synchronous Mode

Synchronous mode forces the server to wait for world.tick() from the client rather than running as fast as possible.
This is ideal if you need to do per-frame work that makes your client slower than the server.

settings = world.get_settings()
settings.fixed_delta_seconds = 1 / 30
settings.synchronous_mode = True # Enables synchronous mode
world.apply_settings(settings)

If you are using the built-in traffic manager for autopilot, you will need to set it to synchronous mode as well:

tm = client.get_trafficmanager()
tm.set_synchronous_mode(True)

See spawn_npc.py for an example of synchronous mode.
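
A minimal sketch of the resulting client loop (assuming a camera sensor has already been spawned; the queue is just one way to match sensor data to ticks):

import queue

sensor_queue = queue.Queue()
camera.listen(sensor_queue.put)

while True:
  frame = world.tick()          # advance the simulation by fixed_delta_seconds
  image = sensor_queue.get()    # block until this tick's sensor data arrives
  assert image.frame == frame   # data and tick belong to the same simulation frame
  # ... process or save the image here ...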

Troubleshooting

  • "trying to create rpc server for traffic manager; but the system failed to create because of bind error" when calling client.get_trafficmanager()
    • Another process is already bound to the traffic manager's default port (8000). Find it with lsof -i :8000 and kill it, or request the traffic manager on a different port, as in the one-liner below.
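
If the port cannot be freed, another option is to request the traffic manager on a different port (every client must then use the same port):

tm = client.get_trafficmanager(8001)  # any free port; the default is 8000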

Coordinate System

CARLA is based on UE4, which uses a left-handed, Z-up coordinate system: up is +z, forward is +x, and right is +y.

To convert a (yaw, pitch, roll) rotation to a right-handed coordinate system, negate yaw and roll, i.e. use (-yaw, pitch, -roll).
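
For example, a small helper applying this rule to a CARLA transform (a sketch; it assumes the right-handed frame is obtained by flipping the Y axis, which also negates the y coordinate of positions):

def to_right_handed(transform):
  """Convert a carla.Transform to a right-handed, Z-up convention by flipping the Y axis."""
  loc, rot = transform.location, transform.rotation
  position = (loc.x, -loc.y, loc.z)            # negate y for positions
  rotation = (-rot.yaw, rot.pitch, -rot.roll)  # angles are in degrees in CARLA
  return position, rotation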