Introducing geometric constraint expressions into robot constrained motion specification and control

Gianni Borghesan, Enea Scioni, Abderrahmane Kheddar, Herman Bruyninckx

Research output: Contribution to journal › Journal article › Academic › peer review

3 Citations (Scopus)

Abstract

The problem of robotic task definition and execution was pioneered by Mason, who defined setpoint constraints where the position, velocity, and/or forces are expressed in one particular task frame for a 6-DOF robot. Later extensions generalized this approach to constraints in 1) multiple frames; 2) redundant robots; 3) other sensor spaces such as cameras; and 4) trajectory tracking. Our work extends task definition to 1) expressions of constraints, with a focus on expressions between geometric entities (distances and angles), in place of explicit set-point constraints; 2) a systematic composition of constraints; 3) runtime monitoring of all constraints (which allows runtime sequencing of constraint sets via, for example, a Finite State Machine); and 4) formal task descriptions, which can be used by symbolic reasoners to plan and analyse tasks. This means that tasks are seen as ordered groups of constraints to be achieved by the robot's motion controller, possibly with a different set of geometric expressions to measure outputs that are not controlled but are relevant to assessing the task's evolution. These monitored expressions may result in events that trigger switching to another ordered group of constraints to execute and monitor. For these task specifications, formal language definitions are introduced in the JSON-schema modeling language.
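The task model the abstract describes (ordered groups of constraints, plus monitored geometric expressions whose events trigger a switch to the next group) can be illustrated with a minimal Python sketch. All names here (`ConstraintSet`, `TaskFSM`, `point_distance`) and the event model are illustrative assumptions, not the paper's actual API:

```python
import math

def point_distance(p, q):
    """Geometric expression: Euclidean distance between two points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

class ConstraintSet:
    """One ordered group of constraints, in the spirit of the abstract.

    `constraints` are expressions the motion controller would drive to a
    target; `monitors` are expressions that are only observed, and whose
    truth raises an event. (Hypothetical structure, for illustration.)
    """
    def __init__(self, name, constraints, monitors):
        self.name = name
        self.constraints = constraints
        self.monitors = monitors

class TaskFSM:
    """Minimal finite-state machine sequencing constraint sets at runtime."""
    def __init__(self, sets):
        self.sets = sets
        self.index = 0

    @property
    def active(self):
        return self.sets[self.index]

    def step(self, state):
        # Evaluate monitored expressions; a raised event advances the FSM
        # to the next ordered group of constraints.
        for monitor in self.active.monitors:
            if monitor(state):
                self.index = min(self.index + 1, len(self.sets) - 1)
                break
        return self.active.name

# Example task: approach until the end-effector is within 1 cm of the
# object, then switch to a "contact" constraint set.
approach = ConstraintSet(
    "approach",
    constraints=[lambda s: point_distance(s["ee"], s["obj"])],
    monitors=[lambda s: point_distance(s["ee"], s["obj"]) < 0.01],
)
contact = ConstraintSet("contact", constraints=[], monitors=[])
fsm = TaskFSM([approach, contact])
```

Here the monitored distance expression is evaluated every control cycle; the controller only ever acts on the active set's constraints, matching the abstract's separation between controlled and merely monitored expressions.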

Language: English
Article number: 7348670
Pages: 1140-1147
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 1
Issue: 2
DOI: 10.1109/LRA.2015.2506119
Status: Published - 1 Jul 2016


Keywords

    Behaviour-Based Systems
    Middleware and Programming Environments
    Motion and Path Planning
    Software

    Cite this

    @article{19821d333fa6402ba8786ebd27f86738,
    title = "Introducing geometric constraint expressions into robot constrained motion specification and control",
    keywords = "Behaviour-Based Systems, Middleware and Programming Environments, Motion and Path Planning, Software",
    author = "Gianni Borghesan and Enea Scioni and Abderrahmane Kheddar and Herman Bruyninckx",
    year = "2016",
    month = "7",
    day = "1",
    doi = "10.1109/LRA.2015.2506119",
    language = "English",
    volume = "1",
    pages = "1140--1147",
    journal = "IEEE Robotics and Automation Letters",
    issn = "2377-3766",
    publisher = "Institute of Electrical and Electronics Engineers",
    number = "2",

    }

