{"id":24388,"date":"2025-07-14T10:00:50","date_gmt":"2025-07-14T10:00:50","guid":{"rendered":"https:\/\/metaverseplanet.net\/blog\/?p=24388"},"modified":"2026-01-06T13:02:46","modified_gmt":"2026-01-06T13:02:46","slug":"avatar-movie-becomes-reality","status":"publish","type":"post","link":"https:\/\/metaverseplanet.net\/blog\/avatar-movie-becomes-reality\/","title":{"rendered":"&#8220;Avatar&#8221; Movie Becomes Reality! Japan Develops Capsule for Robot Control via Muscle Movements"},"content":{"rendered":"\n<p>Sometimes, what we see in science fiction movies suddenly becomes reality. A new project in Japan has created just such an impact. A company called <strong>H2L<\/strong> has developed a capsule that allows for remote control of a robot by reading human muscle movements. With this system, robots no longer move based on a screen interface; they move with your body. This means it&#8217;s now possible not just to watch, but to literally <em>be inside<\/em> the robot.<\/p>\n\n\n\n<p>The capsule developed by H2L utilizes a truly extraordinary technology. <strong>Sensors<\/strong> detect even the slightest muscle movements in the user&#8217;s body, enabling immediate remote control of a robot. A simple flick of your wrist or a subtle contraction in your calf can cause the robot to walk, carry an object, or open a door. 
Essentially, it makes you feel as though your own body has been instantly <strong>teleported<\/strong> to another location.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-4edf6d97de3e69bc1e673cbca38a51d6604b7c5e-1024x576.jpg\" alt=\"\" class=\"wp-image-24390\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-4edf6d97de3e69bc1e673cbca38a51d6604b7c5e-1024x576.jpg 1024w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-4edf6d97de3e69bc1e673cbca38a51d6604b7c5e-300x169.jpg 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-4edf6d97de3e69bc1e673cbca38a51d6604b7c5e-768x432.jpg 768w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-4edf6d97de3e69bc1e673cbca38a51d6604b7c5e-390x220.jpg 390w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-4edf6d97de3e69bc1e673cbca38a51d6604b7c5e-150x84.jpg 150w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-4edf6d97de3e69bc1e673cbca38a51d6604b7c5e.jpg 1200w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>Thanks to the integrated screen and speakers, you can simultaneously perceive everything the robot sees and hears. This goes far beyond simply guiding a robot with a joystick; it&#8217;s now about <strong>physical integration<\/strong>! H2L plans to push this technology even further.<\/p>\n\n\n\n<p>The company is currently developing a system called <strong>&#8220;proprioceptive feedback.&#8221;<\/strong> This system will allow you to feel the pressure and weight of objects the robot touches in your own physical body. 
For example, if the robot lifts a box, your arm will feel resistance as if it were carrying that weight.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"536\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-b0e697ad27a41b6ff3894bc2a259043678790bee-1024x536.jpg\" alt=\"\" class=\"wp-image-24389\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-b0e697ad27a41b6ff3894bc2a259043678790bee-1024x536.jpg 1024w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-b0e697ad27a41b6ff3894bc2a259043678790bee-300x157.jpg 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-b0e697ad27a41b6ff3894bc2a259043678790bee-768x402.jpg 768w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-b0e697ad27a41b6ff3894bc2a259043678790bee-150x79.jpg 150w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/s-b0e697ad27a41b6ff3894bc2a259043678790bee.jpg 1200w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>The capsule is not yet available for general sale, but its price is already known: approximately <strong>30 million yen<\/strong>, or about <strong>$208,000 USD<\/strong>. Therefore, it&#8217;s currently designed more for researchers, technology investors, and private users interested in new technologies. Although the price is high, it&#8217;s possible that, much like early smartphones, this technology could become more accessible in the future.<\/p>\n\n\n\n<p>While the technology is currently expensive and limited, many experts believe it could revolutionize future systems in <strong>work, healthcare, and entertainment<\/strong>. 
What potential applications do you envision for a system like this?<\/p>","protected":false},"excerpt":{"rendered":"<p>Sometimes, what we see in science fiction movies suddenly becomes reality. A new project in Japan has created just such an impact. A company called H2L has developed a capsule that allows for remote control of a robot by reading human muscle movements. With this system, robots no longer move based on a screen interface; &hellip;<\/p>\n","protected":false},"author":1,"featured_media":24391,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAown96uCw:productID":"","footnotes":""},"categories":[336,119],"tags":[337,345,340],"class_list":["post-24388","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-futurescience","category-robotic","tag-future-energy","tag-robot-news","tag-science-news"],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/24388","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/comments?post=24388"}],"version-history":[{"count":0,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/24388\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media\/24391"}],"wp:attachment":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media?parent=24388"}],"wp:term":[{"taxonomy":"category","embeddable":true
,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/categories?post=24388"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/tags?post=24388"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}