Xiaomi presented a bipedal humanoid robot during a launch event for its foldable devices. The CyberOne is capable of perceiving 3D space, as well as recognizing individuals, gestures, and facial expressions. The robot can identify 45 types of human emotion and even comfort people in moments of sadness. Xiaomi envisions a range of real-world roles for the bot, including manufacturing assistance and human companionship.
Long gone are the days when a consumer electronics company could simply release a smartphone and call it a day. At today’s big launch event in Beijing, Xiaomi followed up its foldable phone news by handing the floor to CyberOne. The bipedal robot joined Lei Jun on stage, greeting the CEO and handing him a long-stemmed flower.
At first glance, the robot isn’t exactly sprightly in its movements, but it’s still a solid demo, and very much not a person in a spandex suit. It’s the latest sign of Xiaomi’s growing robotics ambitions, which began with vacuums and have since expanded to include last year’s Spot-like CyberDog.
We’ve seen plenty of consumer brands, including Samsung and LG, flex their robotic muscle at events like this, so it’s hard to tell where CyberOne falls on the spectrum between serious intent and stage performance.
Lei Jun was quick to point out the company’s financial investment in the sector, noting that CyberOne’s AI and mechanical capabilities were developed in-house by Xiaomi Robotics Lab. He said the company has invested heavily in research and development across several areas, including software, hardware, and algorithms.
— leijun (@leijun) August 11, 2022
There’s an incredibly broad set of claims here, including the ability to understand human emotions. Xiaomi notes that humanoid robots rely on vision to process their environments. Equipped with a self-developed Mi-Sense depth vision module combined with an AI interaction algorithm, CyberOne is capable of perceiving 3D space, as well as recognizing individuals, gestures, and facial expressions, allowing it not only to see but to process its environment. To communicate with the world, CyberOne is equipped with a self-developed MiAI environment semantics recognition engine and a MiAI vocal emotion identification engine, enabling it to recognize 85 types of environmental sounds and 45 classifications of human emotion. CyberOne is able to detect happiness, and even comfort the user in times of sadness. All of these features are integrated into CyberOne’s processing units, which are paired with a curved OLED module to display real-time interactive information.
Equally broad are the promised real-world applications, ranging from manufacturing assistance to human companionship. There will certainly be plenty of use for both of these feature sets in the future, but that’s a long way off from this presentation. For now, it probably makes the most sense to view CyberOne as something of an analog to, say, Honda’s Asimo: a promising experiment that serves as a good brand ambassador for much of the development being undertaken in the field.