TY - JOUR
T1 - Introducing geometric constraint expressions into robot constrained motion specification and control
AU - Borghesan, Gianni
AU - Scioni, Enea
AU - Kheddar, Abderrahmane
AU - Bruyninckx, Herman
PY - 2016/7/1
Y1 - 2016/7/1
N2 - The problem of robotic task definition and execution was pioneered by Mason, who defined set-point constraints where the position, velocity, and/or forces are expressed in one particular task frame for a 6-DOF robot. Later extensions generalized this approach to constraints in 1) multiple frames; 2) redundant robots; 3) other sensor spaces such as cameras; and 4) trajectory tracking. Our work extends task definition to 1) expressions of constraints, with a focus on expressions between geometric entities (distances and angles), in place of explicit set-point constraints; 2) a systematic composition of constraints; 3) runtime monitoring of all constraints (which allows for runtime sequencing of constraint sets via, for example, a finite state machine); and 4) formal task descriptions that can be used by symbolic reasoners to plan and analyse tasks. This means that tasks are seen as ordered groups of constraints to be achieved by the robot's motion controller, possibly with different sets of geometric expressions to measure outputs, which are not controlled but are relevant to assess the task evolution. These monitored expressions may result in events that trigger switching to another ordered group of constraints to execute and monitor. For these task specifications, formal language definitions are introduced in the JSON-schema modeling language.
AB - The problem of robotic task definition and execution was pioneered by Mason, who defined set-point constraints where the position, velocity, and/or forces are expressed in one particular task frame for a 6-DOF robot. Later extensions generalized this approach to constraints in 1) multiple frames; 2) redundant robots; 3) other sensor spaces such as cameras; and 4) trajectory tracking. Our work extends task definition to 1) expressions of constraints, with a focus on expressions between geometric entities (distances and angles), in place of explicit set-point constraints; 2) a systematic composition of constraints; 3) runtime monitoring of all constraints (which allows for runtime sequencing of constraint sets via, for example, a finite state machine); and 4) formal task descriptions that can be used by symbolic reasoners to plan and analyse tasks. This means that tasks are seen as ordered groups of constraints to be achieved by the robot's motion controller, possibly with different sets of geometric expressions to measure outputs, which are not controlled but are relevant to assess the task evolution. These monitored expressions may result in events that trigger switching to another ordered group of constraints to execute and monitor. For these task specifications, formal language definitions are introduced in the JSON-schema modeling language.
KW - Behaviour-Based Systems
KW - Middleware and Programming Environments
KW - Motion and Path Planning
KW - Software
UR - http://www.scopus.com/inward/record.url?scp=85061424275&partnerID=8YFLogxK
U2 - 10.1109/LRA.2015.2506119
DO - 10.1109/LRA.2015.2506119
M3 - Article
AN - SCOPUS:85061424275
SN - 2377-3766
VL - 1
SP - 1140
EP - 1147
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 2
M1 - 7348670
ER -