IHPC Tech Hub

IHPC Tech Hub showcases IHPC's suite of in-house developed applications, tools and technologies that help you unlock new possibilities and overcome business challenges. By providing valuable insights, these solutions let you predict and shape commercial outcomes, automate processes, and free up resources from repetitive and labour-intensive tasks. 

Discover the power of computational modelling, simulation and AI that bring positive impact to your business. 

Human-Robot Collaborative Artificial Intelligence (Collab AI)

The Human-Robot Collaborative Artificial Intelligence (Collab AI) programme aims to enable robots to perform as team members alongside humans by understanding tasks, procedures and human actions, and by learning quickly and robustly. The goal is for robots to work with humans on complex and dynamic tasks that cannot be easily automated, but where robots can still assist with parts of the task that are ergonomically difficult or dangerous for humans, or that require skills robots are good at, such as performing repetitive actions with high precision.

Today, because of limitations in robot sensing and intelligence, specialised set-ups are required before robots can be deployed alongside humans. Collab AI instead seeks to develop technologies that enable robots to adapt to humans and work safely in human environments, for example: 
  • Learning to recognise new objects through visual self-exploration 
  • Learning new tasks by observing human demonstration and teaching, which allows non-experts to work with robots without needing to explicitly program them
  • Collaborating naturally with humans on tasks through visual and tactile guidance, natural speech/dialogue interaction and an AR interface
In this way, Collab AI technologies help to reduce cost and increase the flexibility of robot deployments, making them suitable for high-mix, low-volume (HMLV) use cases, such as hyper-personalised manufacturing. 

Features 

The Collab AI programme is developing a human-robot collaborative system solution that:
  • Enables natural collaboration with humans
  • Integrates multi-modal perception and dialogue capability
  • Leverages commonsense knowledge
  • Uses a cognitive architecture as a basis for integration
  • Stays agnostic to the robotic platform
  • Supports learning from instructions and/or demonstrations

The Science Behind

Cognitive Architecture



A cognitive architecture (CA) implements a principled, cognitively inspired theory of robotic intelligence that provides capabilities for: 

  • Maintaining common ground on task and human states (working/episodic memory)
  • Reasoning about percepts and goals (inference)
  • Planning and executing actions (problem-solving, skill retrieval)
  • Learning from experience (skill learning)
  • Integrating lower-level robotics capabilities
  • Supporting high-level approaches to task planning and behaviour generation that
    • Are goal-driven rather than state-driven
    • Allow abstract skill specification instead of hardware-specific programmes
  • Handling exceptions more intelligently
  • Providing significant benefits, such as reusability and generality, compared to ad-hoc robotic architectures

[Watch the video] The ability of the CA to maintain object permanence (i.e., remembering/tracking objects that go out of visual view).
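
The behaviour described above can be illustrated with a minimal sketch in Python. The class and method names below (WorkingMemory, CognitiveAgent, plan, etc.) are illustrative assumptions and do not correspond to the actual Collab AI implementation; the sketch only shows how a goal-driven loop over working memory, inference, planning and episodic learning fits together, including simple object permanence.

    # Minimal sketch of a goal-driven cognitive-architecture loop.
    # Hypothetical names and interfaces; not the Collab AI implementation.
    from dataclasses import dataclass, field
    from typing import Dict, List, Optional, Tuple

    @dataclass
    class Percept:
        """One observation of an object from the robot's sensors."""
        object_id: str
        position: Tuple[float, float, float]
        visible: bool = True

    @dataclass
    class WorkingMemory:
        """Keeps the last known state of every object, so objects that leave
        the camera's view are still remembered (object permanence)."""
        objects: Dict[str, Percept] = field(default_factory=dict)

        def update(self, percepts: List[Percept]) -> None:
            seen = {p.object_id for p in percepts}
            for p in percepts:
                self.objects[p.object_id] = p
            for oid, old in self.objects.items():
                if oid not in seen:
                    self.objects[oid] = Percept(oid, old.position, visible=False)

    class CognitiveAgent:
        """Goal-driven agent: reasons over working memory, selects an abstract
        skill, delegates execution to any robot interface, and records the
        outcome in episodic memory for later skill learning."""

        def __init__(self, robot) -> None:
            self.memory = WorkingMemory()
            self.episodes: List[dict] = []  # episodic memory of past attempts
            self.robot = robot              # platform-agnostic: needs execute(skill, percept)

        def plan(self, goal: str) -> Optional[Tuple[str, str]]:
            # Toy inference: 'deliver:<object>' maps to a pick-and-place skill
            # if the object is known, even when it is currently out of view.
            if goal.startswith("deliver:"):
                target = goal.split(":", 1)[1]
                if target in self.memory.objects:
                    return target, "pick_and_place"
            return None

        def step(self, percepts: List[Percept], goal: str) -> Optional[str]:
            self.memory.update(percepts)    # perception -> working memory
            skill = self.plan(goal)         # goal-driven task planning
            if skill is None:
                return None
            target, action = skill
            success = self.robot.execute(action, self.memory.objects[target])
            self.episodes.append({"goal": goal, "skill": skill, "success": success})
            return action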

Programming-Free New Object Registration

Collab AI developed a new method to teach the robot to recognise new objects through visual self-exploration. This method is called “Teaching with Active and Incremental Learning for Object Registration (TAILOR)”.

About TAILOR: 

  • Provides an interactive process for training a deep learning-based object detector
  • Alleviates the need for costly and tedious data collection and annotation
  • Allows non-expert users to teach the system to recognise new objects without the need for programming
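
As a rough illustration of this interactive idea (not the published TAILOR method), the sketch below shows an active, incremental registration loop: the robot explores new viewpoints by itself, asks the user for a label only when its detector is uncertain, and fine-tunes incrementally. The function names, budget and confidence threshold are hypothetical placeholders.

    # Hedged sketch of active, incremental object registration in the spirit
    # of TAILOR; detector, uncertainty measure and capture routine are placeholders.
    from typing import Any, Callable, List, Tuple

    def register_new_object(
        capture_view: Callable[[], Any],              # robot moves and grabs a new image
        predict_confidence: Callable[[Any], float],   # detector's confidence on that view
        ask_user_label: Callable[[Any], str],         # brief question to the user, no coding
        fine_tune: Callable[[List[Tuple[Any, str]]], None],
        budget: int = 20,
        threshold: float = 0.9,
    ) -> None:
        """Collect only the views the detector is unsure about, have the user
        name them, and incrementally update the detector, so there is no
        offline data collection or annotation campaign."""
        labelled: List[Tuple[Any, str]] = []
        for _ in range(budget):
            image = capture_view()                    # visual self-exploration
            if predict_confidence(image) >= threshold:
                continue                              # this view is already handled
            labelled.append((image, ask_user_label(image)))
            fine_tune(labelled)                       # incremental detector update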

Teaching by Demonstration

CHRIS by I2R

Collab AI developed the Collaborative Human-Robot Intelligent System (CHRIS), which allows a non-expert user to automatically program a robot through teaching by demonstration.

  • Teach and interact with robots in a natural, efficient and human-like manner
  • Teach robots to learn task procedural knowledge from human demonstration
  • Transfer learned skills to new tasks and robot execution with minimal prior knowledge

CHRIS was demonstrated at Hannover Messe 2021 and was a Top 5 finalist for the KUKA Innovation Award 2021.
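
A simplified, hypothetical example of teaching by demonstration is sketched below. It only illustrates the general idea of extracting procedural task knowledge from one recorded demonstration by detecting grasp changes; the data structures and rules are assumptions, not the CHRIS system itself.

    # Illustrative sketch (not the CHRIS implementation): turn one recorded
    # human demonstration into an ordered pick/place procedure a robot can replay.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Frame:
        """One time step of a recorded demonstration."""
        grasped_object: Optional[str]  # object currently held by the demonstrator
        location: str                  # hypothetical named workspace region

    def learn_procedure(demo: List[Frame]) -> List[Tuple[str, str, str]]:
        """Emit an (action, object, location) step whenever the grasp state changes."""
        steps, held = [], None
        for frame in demo:
            if frame.grasped_object and held is None:
                steps.append(("pick", frame.grasped_object, frame.location))
                held = frame.grasped_object
            elif held and frame.grasped_object is None:
                steps.append(("place", held, frame.location))
                held = None
        return steps

    # One demonstration: move a part from the tray to the fixture.
    demo = [Frame(None, "tray"), Frame("part_A", "tray"),
            Frame("part_A", "fixture"), Frame(None, "fixture")]
    print(learn_procedure(demo))
    # [('pick', 'part_A', 'tray'), ('place', 'part_A', 'fixture')]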

Robust Skill Learning

Collab AI developed a general keypoint-based, end-to-end learning framework for low-level task skills:

  • Zero-shot sim-to-real transfer
  • No camera calibration needed
  • Tolerant to noisy workspaces
  • Results:
    o Shows good task generalisation
    o Achieves a good success rate and training efficiency, outperforming AE-based and VAE-based baseline methods
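
For illustration, a minimal keypoint-style policy is sketched below. The keypoint extractor, network sizes and action space are assumptions for the example only, not the programme's published architecture; the point is that a policy acting on normalised image keypoints needs no camera calibration and can be trained entirely in simulation.

    # Minimal keypoint-based policy sketch (hypothetical architecture).
    import numpy as np

    def extract_keypoints(image: np.ndarray, k: int = 8) -> np.ndarray:
        """Placeholder for a learned keypoint detector: returns k normalised
        (x, y) points, so no camera calibration is involved."""
        h, w = image.shape[:2]
        xs = np.linspace(0, w - 1, k)
        ys = np.linspace(0, h - 1, k)
        return np.stack([xs, ys], axis=1) / np.array([w, h])

    class KeypointPolicy:
        """Maps normalised keypoints directly to an end-effector displacement.
        Training such a policy in simulation with domain randomisation is one
        common route to zero-shot sim-to-real transfer."""

        def __init__(self, k: int = 8, action_dim: int = 3, seed: int = 0) -> None:
            rng = np.random.default_rng(seed)
            self.w1 = rng.normal(scale=0.1, size=(2 * k, 32))
            self.w2 = rng.normal(scale=0.1, size=(32, action_dim))

        def act(self, image: np.ndarray) -> np.ndarray:
            kp = extract_keypoints(image).reshape(-1)  # (k, 2) -> (2k,)
            hidden = np.tanh(kp @ self.w1)
            return hidden @ self.w2                    # e.g. a Cartesian displacement

    policy = KeypointPolicy()
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    print(policy.act(frame))                           # small delta-x, delta-y, delta-z command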

     

Natural Interaction via Voice



Collab AI developed capabilities for natural dialogue interaction that allow humans to learn from robots by asking questions and to query the system about tasks, objects and robot status.

  • Reduce training costs by answering basic questions from new team members
  • Reduce time spent switching between hands-on work and machine monitoring, as robots can answer questions about machine states
  • Achieve high accuracy with a small dataset of dialogue Q&As
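
As a toy illustration only, the sketch below answers shop-floor questions by matching them against a small set of stored Q&A pairs and filling in live robot state; the matching rule, example questions and state fields are hypothetical and far simpler than the actual dialogue capability.

    # Toy sketch: answer questions from a small Q&A dataset plus live robot state.
    from typing import Dict

    def tokenize(text: str) -> set:
        return set(text.lower().replace("?", "").split())

    def answer(question: str, qa_pairs: Dict[str, str], state: Dict[str, str]) -> str:
        """Pick the stored question with the largest word overlap and fill in
        the current robot/machine state (placeholders in curly braces)."""
        q_tokens = tokenize(question)
        best = max(qa_pairs, key=lambda q: len(q_tokens & tokenize(q)))
        if not q_tokens & tokenize(best):
            return "Sorry, I don't know. Please check with a supervisor."
        return qa_pairs[best].format(**state)

    qa_pairs = {
        "what is the robot doing": "I am currently {task}.",
        "which station is blocked": "Station {blocked_station} is waiting for parts.",
        "how do i restart the cell": "Press the reset button, then confirm on the panel.",
    }
    state = {"task": "tightening screws on fixture 2", "blocked_station": "B3"}
    print(answer("What is the robot doing now?", qa_pairs, state))
    # I am currently tightening screws on fixture 2.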


Industry Applications

Collab AI technologies can be applied to complex tasks that cannot be fully automated and still require human input, allowing robots to be deployed more effectively to augment human workers. Other possible applications include work-cells for hyper-customised manufacturing, with easy process adaptation through robot teaching.

Currently, Collab AI technologies are being trialled in:
  • Collaborative automated programming for testing consumer products
  • Collaborative harvesting of edible plants

#Collab AI is an AME Programmatic Programme led by IHPC in collaboration with I2R, ARTC, NUS, NTU and SUTD.