Preparation for Halloween Jam ’20

Well this is an exciting time for RoboRabbit-Labs! The first electro-craft workshops will be happening this week, with a Halloween theme!

Why Halloween?

Well, 2020 has been a challenging and even dark year for many. With the full moon falling on Halloween this year, it's probably a good time to sweep out the old and usher in light and good vibes. Halloween provides a great opportunity to let fun into our lives, and offers some iconic characters to play with: Jack-o-Lanterns of course, but also black cats, mummies, zombies, werewolves, ghosts and witches. It's a time when we get to play with these scary themes and restore some balance to our world.

Felted Jack-o-Lantern

The Jack-o-Lantern came together as a project pretty quickly. I took the idea behind the Octopus and simplified it by removing the controller board, the LilyTiny. To make it more widely appealing, I hid the electronic components behind the eyes and mouth, and made the conductive thread lines blend in with the embroidered pumpkin grooves.

The test session

My lovely assistant and boyfriend, robotics and electronics enthusiast Renze, donated two hours of his weekend to help me test the video conferencing and assembly process. The most challenging part for him was learning to sew! Details like threading the needles, and the conductive thread's tendency to kink, slowed his efforts. However, he visibly enjoyed making something that worked and lit up in the end, and was impressed that he could actually produce a successful result, despite how foreign the construction was. Also, he now has a cute ornament of his own making to keep him company in his office.

Posting the kits

Today I will send the last of the kits out for this week’s sessions. It’s incredibly exciting to fill the padded envelopes, stick the addresses on and perform the last checks for completeness. It feels like sending presents out, which I love doing. I hope I can include more little gifts in future kits. This time I was able to include a second battery so that participants can make their own project with the remaining 3 LEDs and conductive thread.

ERF2019 Robotics and AI

ERF2019 took place at the Marriott hotel in Bucharest. As usual, the event was divided into workshops and an exhibition area with different robot-related organisations represented, including European organisations, robot and parts manufacturers, technology hubs, universities and governmental institutions. Check out my post on ERF2017 here.

The main topics for this year included:

  • Robotics and AI
  • Robotics in industry, logistics and transport
  • Collaborative robots
  • Ethics, liability, safety, standardization
  • Marine, aerial, space, wearable robotics

Robotics and AI

ERF2019 and ERF2017 were miles apart in terms of awareness of AI. The EU has identified AI as a key area in which to remain competitive with the US and China, and has allocated substantial funding to it. Lighthouse domains for investment include agrifood, inspection and maintenance, and healthcare.

The EU seeks to build partnerships across Europe, identify the key players and increase synergies between member states. It has set up a collaboration with the Big Data Value Association to create a strategy that will make AI development for robotics thrive.

AI has been identified as a way to get robots out of the cage and interacting directly with people. Robots need to be easier to configure, program and use, and quicker to learn: it should not take weeks of training to introduce robots into health facilities, for instance. Robotics is AI embodied, and it brings its own challenges, like applying learning algorithms in chaotic, unpredictable real-life environments. What are the value exchange points between AI and robotics? Can robots create a feedback loop that helps tune AI algorithms? We need data to take the tech forward, and we need to be able to apply data and learnings across domains and applications. We also need to ensure that users and companies understand what their data is being used for and are comfortable with the contribution they are making.


AI4EU is a Horizon 2020 project that will create a platform to encourage AI sharing and gather resources, algorithms, datasets and AI knowledge in the EU. There are three open calls for startups, SMEs and AI talent, plus a technology transfer programme. It aims to be the focal point in Europe for all AI resources. There will also be a search engine with a knowledge graph optimised for finding AI-specific resources.

The platform will be as useful as we, the contributors, make it. In robotics especially, we need to actively contribute and drive the content, because there is such a long way to go to independent robots. Submissions must be of high quality: safe, effective, rated by the user community, and fully specified. Contributions could also result in financial remuneration. The platform will have two sandbox environments to make it easier to develop AI, and a methodology will be provided for designing AI components that respect EU values – inclusive and gender neutral.

The project started on 1 January 2019, and will deliver version 0 at the end of June 2019, V1 January 2020, V2 January 2021 and V3 December 2021. Industry will be represented by companies such as SAP and Siemens.


I noticed that there were few startups represented at the ERF. There is a startup competition, but I haven't seen startups represented in the workshops where we share ideas. My guess is that it's too expensive for startups to attend without an immediate commercial return. But I also wonder if it's because the pace of the EU projects is too slow for startups – the projects take months to years, and the outcomes are usually at too high a level to be used immediately. I think the EU can only harness the power of innovation in AI and robotics by including the risk-taking startup innovation layer and making it easier for us to understand and consume the vast amount of research done. Then we need to be able to give feedback and improve the entire cycle together. Projects like this should be helping startups like ours to accelerate our development. On the other hand, we should be getting involved and demanding representation so that our needs can be addressed.

Robot app prototyping on Anki Vector

This past weekend, Vikram Radhakrishnan, Lukas Jelinek and I, Thosha Moodley, got together to try some robot app prototyping on the Anki Vector. The Vector is an adorable robot created by Anki, and a follow-up to their first model, Cozmo. We are using it to produce a robot app prototype as a proof of concept.

Vikram, Lukas and Thosha with Anki Vector

Learning on the Vector has been great, as it's easy to get started and pretty cute. The main hassle is that the API requires Python 3.6 and not everything we want to work with is 3.6-ready. Also, the API is in an alpha state, so there are still some incomplete features and bugs.

Our use case is based on the idea of a house sitter app which runs on the Vector or other robot hardware and helps keep an eye on important locations in the home. To get this working we had to get familiar with Vector’s camera and mapping features.
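The patrol logic behind the house sitter idea is plain bookkeeping and can be sketched independently of the robot hardware. A minimal sketch, assuming hypothetical location names, coordinates and check interval (none of these come from our actual prototype); the `typing` imports keep it Python 3.6 compatible, matching the SDK's requirement:

```python
from typing import Dict, Optional

# Hypothetical watch list: location name -> (x, y) on Vector's map, in mm.
WATCH_POINTS = {
    "front door": (350.0, 0.0),
    "kitchen counter": (120.0, 480.0),
}

CHECK_INTERVAL_S = 60.0  # revisit a point once it hasn't been checked for this long


def next_point_to_check(last_seen: Dict[str, float], now: float) -> Optional[str]:
    """Return the most overdue watch point, or None if all were checked recently."""
    overdue = {
        name: now - last_seen.get(name, 0.0)
        for name in WATCH_POINTS
        if now - last_seen.get(name, 0.0) >= CHECK_INTERVAL_S
    }
    return max(overdue, key=overdue.get) if overdue else None
```

On the robot, the returned name would be looked up in `WATCH_POINTS` and handed to Vector's navigation to drive there and grab a camera frame.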

Anki Vector creating a map by exploring

Vector builds a map by exploring the surrounding space. In the picture above you can see different colour codes on the map that represent what Vector knows about that space:

  • Dark grey – unknown
  • Green – no obstacles
  • Dark green – clear of cliff
  • Red – obstacle cube
  • Orange – obstacle proximity
  • Yellow-green – obstacle proximity explored
  • Dark red – obstacle unrecognised
  • Black – cliff
  • Yellow – interesting edge
  • Dark yellow – non-interesting edge
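For our prototype it helps to have this legend in code rather than prose. A small lookup-table sketch of the list above (the colour names are from the map viewer; "dark green" most likely maps to the SDK's "clear of cliff" state, distinct from the black cliff cells):

```python
# Colour legend for Vector's nav map, as a lookup table.
# Values mirror the bullet list above.
NAV_MAP_LEGEND = {
    "dark grey": "unknown",
    "green": "no obstacles",
    "dark green": "clear of cliff",
    "red": "obstacle cube",
    "orange": "obstacle proximity",
    "yellow-green": "obstacle proximity explored",
    "dark red": "obstacle unrecognised",
    "black": "cliff",
    "yellow": "interesting edge",
    "dark yellow": "non-interesting edge",
}


def describe_cell(colour: str) -> str:
    """Translate a map colour into what Vector knows about that cell."""
    return NAV_MAP_LEGEND.get(colour.lower(), "unrecognised colour")
```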

Vector also has a remote_control app in the code samples – watch Vikram controlling it here:

Well, that’s all for now, be sure to stay tuned for updates on our robot application mission!

What I did today – steps on the cleaning robot


The first use case I am focusing on is cleaning the bathroom counter and sink. To prove that this would work, I used a little motor with a disc on the end, covered in cleaning wipes. The motor turned the cleaning pad to clean a surface.

Next I designed a basic schematic of a cleaning arm:


I broke the building of the arm into experiments, the first of which uses a linear actuator, an Arduino Uno, a distance sensor and a force-sensitive resistor (FSR) pressure sensor. The vertical actuator moves downward on a guide until it gets close to the counter, then keeps extending slowly until the pressure sensor detects the level of force needed to clean the counter.
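The approach-then-press behaviour is essentially a threshold loop over the two sensors. A sketch of the decision logic, written in Python for readability (the actual experiment runs on the Arduino Uno, and the distance and FSR values here are illustrative, not measured from my setup):

```python
APPROACH_DISTANCE_MM = 20.0  # below this distance reading, creep instead of moving fast
TARGET_FORCE = 400           # illustrative raw FSR reading for good cleaning pressure


def actuator_step(distance_mm: float, fsr_reading: int) -> str:
    """Decide the vertical actuator's next move from the two sensor readings."""
    if fsr_reading >= TARGET_FORCE:
        return "hold"         # enough pressure on the counter: stop and start cleaning
    if distance_mm > APPROACH_DISTANCE_MM:
        return "extend_fast"  # still far from the counter
    return "extend_slow"      # close to the surface: creep down until contact
```

The Arduino loop would call the same three-way decision each cycle, reading the distance sensor and FSR via `analogRead` and driving the actuator accordingly.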


Today I dismantled this setup to add it onto an aluminium frame, onto an arm I have been working on for some time.

The arm is composed of the horizontal linear actuator that I made with v-slot aluminium extrusions and a NEMA 17 stepper motor. This actuator moves a gantry on which the vertical linear actuator is mounted.
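For a stepper-driven linear axis like this, the number you end up needing everywhere is steps per millimetre of gantry travel. A quick sketch, assuming a typical 1.8° NEMA 17 (200 full steps per revolution); the 8 mm travel per revolution is a hypothetical example, not measured from my build:

```python
STEPS_PER_REV = 200  # 1.8 degrees per full step, typical for a NEMA 17
MM_PER_REV = 8.0     # hypothetical travel per motor revolution (e.g. an 8 mm screw lead)


def steps_for_travel(distance_mm: float, microstepping: int = 1) -> int:
    """Number of step pulses needed to move the gantry a given distance."""
    steps_per_mm = STEPS_PER_REV * microstepping / MM_PER_REV
    return round(distance_mm * steps_per_mm)
```

With those example numbers, one full revolution (200 steps) moves the gantry 8 mm, and enabling 16x microstepping multiplies the pulse count accordingly.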

I have two stepper motors which I’m trying to attach to the arm to rotate the cleaning head and the arm itself, as seen in the schematic.


My new tool organiser – highlight of the day! That’s all for today – until next time.

Tony Belpaeme’s work on educational robotics

This past Friday I spoke to esteemed human-robot interaction professor Angelo Cangelosi.

Angelo’s work is centred around human robot interaction, developmental robotics, robot understanding of language and real world concepts, and the use of neuromorphic systems for robot learning. He is the coordinator of the Horizon 2020 APRIL program (Applications of Personal Robotics through Interaction and Learning) amongst many other responsibilities in the robotics space. My Roborabbit activities allow me to meet truly amazing people!

Tony Belpaeme’s work on educational robots

Angelo gave me some great leads to research. The first was the work of Tony Belpaeme, professor at Ghent and Plymouth Universities. The paper Social robots for education: A review (Belpaeme, Kennedy, Ramachandran, Scassellati and Tanaka, 2018) analyses and collates studies on the use of robots in children's education. It examines the research from the perspective of robots used as teacher, peer and novice (in which the robot is taught by the child). Robots are helpful in an educational setting, with an effectiveness approaching that of human tutors. Embodiment, meaning that the robot, unlike a virtual assistant, has a physical body, triggers social reactions in humans that support learning. Robots could allow for more enriched and individualised learning, both of which are challenges in the modern classroom.

“…physical robots have enhanced learning and affected later behavioral choice more substantially than virtual agents. Compared with instructions from virtual characters, videos of robots, or audio only lessons, robots have produced more rapid learning in cognitive puzzles…”

There are many challenges to using a robot for education including:

  • Robustness of speech recognition for children
  • Other input methods, like tablets, creating a disconnect
  • Building an understanding of learning progress in the robot
  • Choosing a next activity to challenge and engage the child
  • Knowing when to switch activities to engage the child
  • Generating a combination of verbal and nonverbal output and coordinating those
  • The question of how far we want the teaching of our children to be delegated to robots
  • The question of whether we are restricting child development based on the capabilities of the technologies, and not giving children what they really need

These are questions that Roborabbit will need to answer when we enter the social robot educational applications space. The paper also raised the idea that robots could be used in adult education. This could be an interesting avenue for investigation for the future.



