Universal Robots
Universal Robots (UR) offers a range of collaborative robotic arms in different sizes and payload classes that are widely adopted across industry and research for their accessibility and flexibility. A graphical interface on the teach pendant makes it easy and intuitive to program and integrate UR robots. At the same time, advanced users can access the full capabilities of the manipulators through scripting in the URScript language or even by developing software add-ons, so-called URCaps.
The ecosystem around UR robots is highly developer-friendly, with open-source communication libraries, drivers, documentation, and integration support for frameworks such as ROS. Additionally, their simulation tool URSim allows developers to test and validate robot programs and interfaces without needing access to physical hardware, making UR a popular choice for building custom applications.
A guide on installing and running URSim can be found on this page.
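As a taste of the scripting route mentioned above, a URScript program is plain text that can be streamed to the controller's primary interface (TCP port 30001) on a robot or a URSim instance. The sketch below is illustrative only; the host address, program name, and joint targets are placeholders, not values from this guide.

```python
import socket

# Sketch: build a minimal URScript program and stream it to the controller.
# Joint values and the host below are illustrative placeholders.

def build_movej_script(joints, acceleration=1.2, velocity=0.25):
    """Return a minimal URScript program moving to the given joint positions [rad]."""
    targets = ", ".join(f"{q:.4f}" for q in joints)
    return (
        "def demo_program():\n"
        f"  movej([{targets}], a={acceleration}, v={velocity})\n"
        "end\n"
    )

def send_script(script: str, host: str, port: int = 30001, timeout: float = 2.0):
    """Stream a URScript program to the controller (or a URSim instance)."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(script.encode("utf-8"))

# Usage (with a reachable robot or URSim):
#   send_script(build_movej_script([0.0, -1.57, 1.57, -1.57, -1.57, 0.0]),
#               "192.168.56.101")
```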
Due to the reasons mentioned above, UR manipulators are often used for prototyping at AICA and have seen extensive internal development for robot-specific feature integration. The UR hardware collection provided by AICA comes with special tools and functionalities that are unique to UR robots. This guide explains these concepts and how they can be leveraged in AICA Studio.
To use the UR collection, add the latest version of collections/ur-collection to your configuration in AICA Launcher.
Doing this will add multiple new hardware examples as well as a few controllers to AICA Studio. All UR robots share the same hardware interface in AICA Studio. The hardware interface has a large number of parameters, most of which are not important for regular use cases.
For best results with a UR robot, always set the rate of the hardware interface to 500 Hertz, which corresponds to the control rate of the real hardware.
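As a quick sanity check of that setting, 500 Hz corresponds to a 2 ms control period:

```python
# The control period implied by a 500 Hz hardware interface rate.
RATE_HZ = 500
period_ms = 1000.0 / RATE_HZ
print(period_ms)  # → 2.0 (milliseconds per control cycle)
```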

Local and Remote Control
The concept of Local and Remote Control on PolyScope is easiest explained by introducing a primary and secondary device architecture. In Local Control, the robot controller is the primary device and has full authority over loading and starting programs. In other words, the robot has to be used in person through the teach pendant, and any commands sent from an external source will be rejected. Remote Control, on the other hand, allows controlling the robot via external sources such as sockets, I/Os, and the Dashboard Server. In this case, the controller is the secondary device, and external sources can load and start programs or directly send URScript commands to the controller.
Safety features remain active in Remote Control.
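To illustrate what such an external source looks like, the sketch below implements a minimal Dashboard Server client. The Dashboard Server listens on TCP port 29999 and accepts newline-terminated text commands such as `load <program>.urp` and `play`; the host and program names here are placeholders.

```python
import socket

# Minimal sketch of a UR Dashboard Server client (TCP port 29999), one of the
# external sources that can load and start programs in Remote Control.

def encode_command(command: str) -> bytes:
    """Dashboard commands are plain text lines terminated by a newline."""
    return (command.strip() + "\n").encode("ascii")

class DashboardClient:
    def __init__(self, host: str, port: int = 29999, timeout: float = 2.0):
        self.sock = socket.create_connection((host, port), timeout=timeout)
        self.reader = self.sock.makefile("r")
        self.reader.readline()  # consume the greeting line sent on connect

    def send(self, command: str) -> str:
        """Send one command and return the single-line response."""
        self.sock.sendall(encode_command(command))
        return self.reader.readline().strip()

# Usage (with a reachable robot or URSim):
#   client = DashboardClient("192.168.56.101")
#   client.send("load my_program.urp")
#   client.send("play")
```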
Choosing one of the two modes depends on the specific task at hand. During a development phase, it might be preferable to create programs in Local Control, whereas in a production setting, PLCs would be responsible for loading and starting the desired programs while the robot is in Remote Control. With the AICA System, users can get the best of both modes:
- Take full control of the robot from an AICA application (requires Remote Control)
- Run an AICA application as one node of a program (works in both Local and Remote Control)
The two examples below work out of the box with URSim.
Full control of the robot from an AICA application
For this first case, no additional installation steps are required. The robot becomes the secondary device and all motions are coordinated through AICA Studio. Apart from setting the correct robot IP in the hardware interface, two requirements have to be met:
- On the robot, Remote Control has to be activated. For that, first enable Remote Control in the system settings as explained here, then switch from Local to Remote mode in the top right corner of the teach pendant. The interface automatically switches to the Run tab and disables other tabs, indicating that control has been handed over to external sources.

- In AICA Studio, make sure that the parameter Headless Mode, found under the hardware interface parameters, is set to True. This notifies the hardware interface that it is running headless, i.e. it is in charge of providing the full UR program to the robot controller.
Finally, implement an application of your choice in AICA Studio. An example with a joint trajectory controller is given below. Observe how the robot program status goes from Stopped to Running as soon as the hardware interface connects to the robot.
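For reference, the `set_trajectory` payload string used by the joint trajectory controller in the example application can also be assembled programmatically. The helper below is a hypothetical convenience for illustration, not part of AICA Studio; it only mirrors the payload format shown in this guide.

```python
# Sketch: assemble the `set_trajectory` service payload for the joint
# trajectory controller. The format mirrors the example in this guide; the
# helper itself is a hypothetical convenience, not an AICA Studio API.

def make_trajectory_payload(frames, durations, blending_factors):
    if len(frames) != len(durations):
        raise ValueError("each frame needs a corresponding duration")
    return "{{frames: [{}], durations: [{}], blending_factors: [{}]}}".format(
        ", ".join(frames),
        ", ".join(f"{d:.1f}" for d in durations),
        ", ".join(f"{b:.1f}" for b in blending_factors),
    )

print(make_trajectory_payload(["wp_1", "wp_2", "wp_3"], [1.0, 1.0, 1.0], [1.0]))
```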

Example application, remote control
schema: 2-0-4
dependencies:
  core: v4.4.2
frames:
  wp_1:
    reference_frame: world
    position:
      x: -0.027943
      y: 0.600701
      z: 0.202217
    orientation:
      w: 0.171776
      x: 0.985056
      y: 0.002386
      z: -0.012313
  wp_2:
    reference_frame: world
    position:
      x: 0.260809
      y: 0.604927
      z: 0.194871
    orientation:
      w: 0.132343
      x: 0.95897
      y: -0.030587
      z: -0.248852
  wp_3:
    reference_frame: world
    position:
      x: 0.147083
      y: 0.552997
      z: 0.328354
    orientation:
      w: 0.012478
      x: 0.999843
      y: 0.000392
      z: -0.012536
on_start:
  load:
    hardware: hardware
sequences:
  sequence:
    display_name: Sequence
    steps:
      - delay: 2
      - call_service:
          controller: joint_trajectory_controller
          hardware: hardware
          service: set_trajectory
          payload: "{frames: [wp_1, wp_2, wp_3], durations: [1.0, 1.0, 1.0],
            blending_factors: [1.0]}"
hardware:
  hardware:
    display_name: Hardware Interface
    urdf: Universal Robots 5e
    rate: 500
    events:
      transitions:
        on_load:
          load:
            - controller: robot_state_broadcaster
              hardware: hardware
            - controller: joint_trajectory_controller
              hardware: hardware
    controllers:
      robot_state_broadcaster:
        plugin: aica_core_controllers/RobotStateBroadcaster
        events:
          transitions:
            on_load:
              switch_controllers:
                hardware: hardware
                activate: robot_state_broadcaster
      joint_trajectory_controller:
        plugin: aica_core_controllers/trajectory/JointTrajectoryController
        events:
          predicates:
            has_trajectory_succeeded:
              application: stop
          transitions:
            on_load:
              switch_controllers:
                hardware: hardware
                activate: joint_trajectory_controller
graph:
  positions:
    buttons:
      button:
        x: -460
        y: 600
    hardware:
      hardware:
        x: 620
        y: -20
    sequences:
      sequence:
        x: 40
        y: 560
  buttons:
    button:
      on_click:
        sequence:
          start: sequence
  edges:
    sequence_sequence_event_trigger_2_hardware_hardware_joint_trajectory_controller_set_trajectory:
      path:
        - x: 420
          y: 1060
        - x: 620
          y: 1060
        - x: 620
          y: 900
    sequence_sequence_event_trigger_1_hardware_hardware_joint_trajectory_controller_set_trajectory:
      path:
        - x: 240
          y: 860
    hardware_hardware_joint_trajectory_controller_has_trajectory_succeeded_on_stop_on_stop:
      path:
        - x: 540
          y: 780
        - x: 540
          y: 480
        - x: -20
          y: 480
        - x: -20
          y: 140
Run an AICA application as one node of a program
The second case requires the External Control URCap to be installed. While the robot stays the primary device, the URCap comes with a program node that allows handing over control to secondary devices during the execution of that program node. This is especially useful for integrating smaller, single-purpose AICA applications into bigger, existing cells. Once the AICA application has finished its task, it hands back control to the robot, which then continues the execution of the main UR program. To set this up, follow these steps:
- The External Control URCap needs to be configured with the correct remote control address. Navigate to the Installation tab and set the address to that of the device that will be running the AICA application.

- Insert the Control by <IP> program node from the External Control URCap at the desired location of a new or existing UR program. The robot remains in Local mode.
- Set the Headless Mode parameter in the hardware interface to False.
- In AICA Studio, add the UR Dashboard Controller to the hardware interface. Its program_running predicate signals that the UR program has arrived at the Control by <IP> node and is ready to receive control commands. After completion of the task in AICA Studio, control is handed back using a service call and the UR program resumes execution. More details about this controller follow in the next section.

warning: Sending motion commands to the robot should exclusively happen while the program_running predicate is true. Activate motion controllers using this predicate and deactivate them upon handing back control.
The example with a joint trajectory controller from above is given here in its Local Control version. Be sure to start the application in AICA Studio first, and the UR program second.
Example application, local control
schema: 2-0-4
dependencies:
  core: v4.4.2
frames:
  wp_1:
    reference_frame: world
    position:
      x: -0.027943
      y: 0.600701
      z: 0.202217
    orientation:
      w: 0.171776
      x: 0.985056
      y: 0.002386
      z: -0.012313
  wp_2:
    reference_frame: world
    position:
      x: 0.260809
      y: 0.604927
      z: 0.194871
    orientation:
      w: 0.132343
      x: 0.95897
      y: -0.030587
      z: -0.248852
  wp_3:
    reference_frame: world
    position:
      x: 0.147083
      y: 0.552997
      z: 0.328354
    orientation:
      w: 0.012478
      x: 0.999843
      y: 0.000392
      z: -0.012536
on_start:
  load:
    hardware: hardware
sequences:
  sequence:
    display_name: Sequence
    steps:
      - delay: 2
      - call_service:
          controller: joint_trajectory_controller
          hardware: hardware
          service: set_trajectory
          payload: "{frames: [wp_1, wp_2, wp_3], durations: [1.0, 1.0, 1.0],
            blending_factors: [1.0]}"
hardware:
  hardware:
    display_name: Hardware Interface
    urdf: Universal Robots 5e
    rate: 500
    events:
      transitions:
        on_load:
          load:
            - controller: robot_state_broadcaster
              hardware: hardware
            - controller: joint_trajectory_controller
              hardware: hardware
            - controller: ur_dashboard_controller
              hardware: hardware
    parameters:
      headless_mode: "false"
    controllers:
      robot_state_broadcaster:
        plugin: aica_core_controllers/RobotStateBroadcaster
        events:
          transitions:
            on_load:
              switch_controllers:
                hardware: hardware
                activate: robot_state_broadcaster
      joint_trajectory_controller:
        plugin: aica_core_controllers/trajectory/JointTrajectoryController
        events:
          predicates:
            has_trajectory_succeeded:
              call_service:
                controller: ur_dashboard_controller
                hardware: hardware
                service: hand_back_control
          transitions:
            on_activate:
              sequence:
                start: sequence
      ur_dashboard_controller:
        plugin: aica_ur_controllers/URDashboardController
        events:
          predicates:
            program_running:
              switch_controllers:
                hardware: hardware
                activate: joint_trajectory_controller
            hand_back_control_success:
              application: stop
          transitions:
            on_load:
              switch_controllers:
                hardware: hardware
                activate: ur_dashboard_controller
graph:
  positions:
    hardware:
      hardware:
        x: 780
        y: 0
    sequences:
      sequence:
        x: 80
        y: 380
  edges:
    sequence_sequence_event_trigger_2_hardware_hardware_joint_trajectory_controller_set_trajectory:
      path:
        - x: 420
          y: 1060
        - x: 620
          y: 1060
        - x: 620
          y: 900
    on_start_on_start_hardware_hardware:
      path:
        - x: 440
          y: 40
        - x: 440
          y: 60
    hardware_hardware_joint_trajectory_controller_on_activate_sequence_sequence:
      path:
        - x: 20
          y: 760
        - x: 20
          y: 440
    sequence_sequence_event_trigger_1_hardware_hardware_joint_trajectory_controller_set_trajectory:
      path:
        - x: 280
          y: 920
    hardware_hardware_ur_dashboard_controller_program_running_hardware_hardware_joint_trajectory_controller:
      path:
        - x: 460
          y: 1300
        - x: 460
          y: 640
    hardware_hardware_ur_dashboard_controller_hand_back_control_success_on_stop_on_stop:
      path:
        - x: -20
          y: 1260
        - x: -20
          y: 140
    hardware_hardware_joint_trajectory_controller_has_trajectory_succeeded_hardware_hardware_ur_dashboard_controller_hand_back_control:
      path:
        - x: 680
          y: 840
        - x: 680
          y: 1380