{"id":43764,"date":"2026-04-27T07:27:32","date_gmt":"2026-04-27T07:27:32","guid":{"rendered":"https:\/\/metaverseplanet.net\/blog\/?p=43764"},"modified":"2026-04-27T07:27:34","modified_gmt":"2026-04-27T07:27:34","slug":"the-robotaxi-that-thinks-like-a-human","status":"publish","type":"post","link":"https:\/\/metaverseplanet.net\/blog\/the-robotaxi-that-thinks-like-a-human\/","title":{"rendered":"Inside Geely&#8217;s Eva Cab: The Robotaxi That Thinks Like a Human"},"content":{"rendered":"\n<p>I spend a massive amount of time analyzing autonomous vehicle technologies, and to be completely honest, a lot of the announcements start to blend together after a while. Usually, it&#8217;s just a traditional car with a bulky sensor pod bolted to the roof and a minor software update. But when I was digging into the specs of Geely&#8217;s newly unveiled <strong>Eva Cab<\/strong> prototype, it genuinely caught my attention.<\/p>\n\n\n\n<p>We are looking at China&#8217;s first entirely domestic robotaxi, and the crucial difference here is in its DNA. 
It wasn&#8217;t built for a human and then adapted for a computer; it was engineered from day one with an artificial intelligence core designed to perceive, predict, and react exactly like a seasoned driver.<\/p>\n\n\n\n<p>Let&#8217;s break down why the Eva Cab is a massive leap forward for autonomous mobility and what it means for the future of our daily commutes.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Beyond Retrofits: A Purpose-Built Autonomous Machine<\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"720\" height=\"405\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2026\/04\/indir-1-4.avif\" alt=\"\" class=\"wp-image-43765\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2026\/04\/indir-1-4.avif 720w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2026\/04\/indir-1-4-300x169.avif 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2026\/04\/indir-1-4-390x220.avif 390w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2026\/04\/indir-1-4-150x84.avif 150w\" sizes=\"(max-width: 720px) 100vw, 720px\" \/><\/figure>\n\n\n\n<p>Most self-driving cars on the road today are what we call &#8220;retrofitted.&#8221; They are standard production vehicles modified to drive themselves. The Eva Cab throws that playbook out the window.<\/p>\n\n\n\n<p>By designing the vehicle specifically for autonomous ride-hailing (robotaxi services), Geely has managed to deeply integrate the software and hardware. There is no compromise between human ergonomics and machine efficiency. 
The result is a vehicle that doesn&#8217;t just &#8220;see&#8221; the road\u2014it understands the flow of traffic on a cognitive level.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">The Brains: Processing Power That Rivals Supercomputers<\/h3>\n\n\n\n<p>When I looked at the computing hardware driving the Eva Cab, I was seriously impressed. It isn&#8217;t just about having good cameras; it\u2019s about how fast the car can make sense of what those cameras see.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>The Step 3.5 Model:<\/strong> This is the heavy lifter. Boasting a staggering <strong>196 billion parameters<\/strong>, it gives the vehicle a deep, contextual understanding of its environment.<\/li>\n\n\n\n<li><strong>H9 Autonomous Driving System:<\/strong> This system pushes up to <strong>1,400 TOPS<\/strong> (Tera Operations Per Second) of processing power.<\/li>\n\n\n\n<li><strong>Lightning-Fast Inference:<\/strong> The car achieves an inference speed of <strong>350 TPS<\/strong> (Tokens Per Second).<\/li>\n<\/ul>\n\n\n\n<p><strong>What does this mean for you?<\/strong> It means the Eva Cab processes environmental data and makes critical decisions <strong>three times faster<\/strong> than a human driver. 
No matter how much caffeine you&#8217;ve had or how good your reflexes are, you simply cannot react as fast as this machine.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">World Action Model (WAM): Driving with &#8220;Intuition&#8221;<\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"720\" height=\"540\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2026\/04\/indir-11.avif\" alt=\"\" class=\"wp-image-43766\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2026\/04\/indir-11.avif 720w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2026\/04\/indir-11-300x225.avif 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2026\/04\/indir-11-150x113.avif 150w\" sizes=\"(max-width: 720px) 100vw, 720px\" \/><\/figure>\n\n\n\n<p>The biggest hurdle for self-driving cars has always been the &#8220;edge cases&#8221;\u2014those weird, unpredictable moments that don&#8217;t fit neatly into a programming rulebook. This is where Geely&#8217;s <strong>World Action Model (WAM)<\/strong> completely changes the game.<\/p>\n\n\n\n<p>Older autonomous systems use a linear &#8220;perceive-then-decide&#8221; framework. It&#8217;s clunky. WAM, on the other hand, operates as a continuous, closed loop. 
It seamlessly blends overarching route planning with split-second tactical decisions.<\/p>\n\n\n\n<p>Because of this human-like processing loop, the Eva Cab can handle <strong>99% of daily driving scenarios<\/strong>, including the stuff that usually breaks AI:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Unmarked, chaotic dirt roads in rural areas.<\/li>\n\n\n\n<li>Manual toll booths with confusing lane merges.<\/li>\n\n\n\n<li>Aggressive urban traffic where the &#8220;rules&#8221; are treated more like suggestions.<\/li>\n<\/ul>\n\n\n\n<p>Instead of freezing up when the lines on the road disappear, the Eva Cab relies on its WAM-driven intuition to navigate like a local taxi driver who knows the streets by heart.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">A 360-Degree Shield: Hardware Meets AI<\/h2>\n\n\n\n<p>A smart brain is useless without sharp senses. Geely has equipped the Eva Cab with a sensor suite and physical chassis that respond instantly to the AI&#8217;s commands.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">The Sensor Suite<\/h3>\n\n\n\n<p>The vehicle utilizes a sophisticated three-layer detection system powered by <strong>43 distinct sensors<\/strong>, including advanced LiDAR and high-resolution cameras. This setup creates a flawless 360-degree digital twin of the world around the car. It constantly tracks pedestrians, erratic vehicles, and unexpected debris.<\/p>\n\n\n\n<p>Geely&#8217;s rigorous testing shows that this system achieves a <strong>95% success rate<\/strong> in executing highly complex urban maneuvers.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">The AI-Powered Digital Chassis<\/h3>\n\n\n\n<p>This is the feature that amazed me the most. The Eva Cab features a digital chassis with a reaction time of just <strong>4 milliseconds<\/strong>.<\/p>\n\n\n\n<p>Think about that. 
If a child chases a ball into the street, the AI detects it, calculates the physics, and the chassis begins executing a physical evasive maneuver within 4 milliseconds of the AI&#8217;s command. Geely is fundamentally shifting vehicle safety from <em>passive protection<\/em> (like airbags that deploy when you crash) to <em>active evasion<\/em> (preventing the crash from happening in the first place).<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">The Road to 2027: Redefining Urban Mobility<\/h2>\n\n\n\n<p>Geely isn&#8217;t building this in a vacuum. They are developing the Eva Cab in partnership with <strong>Caocao Mobility<\/strong>, a major ride-hailing player, with a strict timeline aiming for mass production and commercial robotaxi rollout by <strong>2027<\/strong>.<\/p>\n\n\n\n<p>This isn&#8217;t a pipe dream; it&#8217;s a scheduled deployment. They are planning a phased transition from intensive real-world testing environments into a fully driverless commercial fleet. We are looking at a near future where ordering an Eva Cab is as normal as ordering food on your phone.<\/p>\n\n\n\n<p>When we look at the raw computing power, the intuitive World Action Model, and a chassis that reacts in milliseconds, it&#8217;s clear that Geely isn&#8217;t just trying to make a car that drives itself. They are trying to build a driver that is objectively better, safer, and faster than we could ever be.<\/p>\n\n\n\n<p>I&#8217;m incredibly excited to see this hit the streets, but I&#8217;m curious about where you stand. 
<strong>If an Eva Cab pulled up to you today, would you feel comfortable taking a nap in the back seat while it navigates chaotic rush-hour traffic, or are you still holding onto the desire for a physical steering wheel just in case?<\/strong><\/p>\n\n\n\n<h3 class=\"wp-block-heading\">You Might Also Like:<\/h3>\n\n\n<ul class=\"wp-block-latest-posts__list wp-block-latest-posts\"><li><a class=\"wp-block-latest-posts__post-title\" href=\"https:\/\/metaverseplanet.net\/blog\/call-of-duty-vs-battlefield\/\">Call of Duty vs. Battlefield: The Ultimate FPS Rivalry Hits the Big Screen<\/a><\/li>\n<li><a class=\"wp-block-latest-posts__post-title\" href=\"https:\/\/metaverseplanet.net\/blog\/why-meta-is-sacrificing-8000-jobs-to-feed-the-algorithm\/\">The AI Earthquake: Why Meta is Sacrificing 8,000 Jobs to Feed the Algorithm<\/a><\/li>\n<\/ul>","protected":false},"excerpt":{"rendered":"<p>I spend a massive amount of time analyzing autonomous vehicle technologies, and to be completely honest, a lot of the announcements start to blend together after a while. Usually, it&#8217;s just a traditional car with a bulky sensor pod bolted to the roof and a minor software update. 
But when I was digging into the &hellip;<\/p>\n","protected":false},"author":1,"featured_media":43767,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAown96uCw:productID":"","footnotes":""},"categories":[119,336],"tags":[343,345],"class_list":["post-43764","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-robotic","category-futurescience","tag-future-mobility","tag-robot-news"],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/43764","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/comments?post=43764"}],"version-history":[{"count":1,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/43764\/revisions"}],"predecessor-version":[{"id":43768,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/43764\/revisions\/43768"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media\/43767"}],"wp:attachment":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media?parent=43764"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/categories?post=43764"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/tags?post=43764"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}