{"id":24168,"date":"2025-07-08T14:37:34","date_gmt":"2025-07-08T14:37:34","guid":{"rendered":"https:\/\/metaverseplanet.net\/blog\/?p=24168"},"modified":"2026-01-05T09:13:49","modified_gmt":"2026-01-05T09:13:49","slug":"apple-developing-ai-powered-map-for-the-visually-impaired","status":"publish","type":"post","link":"https:\/\/metaverseplanet.net\/blog\/apple-developing-ai-powered-map-for-the-visually-impaired\/","title":{"rendered":"Apple Developing AI-Powered Map for the Visually Impaired"},"content":{"rendered":"\n<p>Technology is continuously expanding the possibilities for individuals with visual impairments. However, navigating daily life independently remains a significant challenge for them. Complex urban environments and insufficient infrastructure particularly complicate wayfinding. Addressing these needs, <strong><em><a href=\"https:\/\/metaverseplanet.net\/blog\/tag\/apple-news-and-content\/\" data-type=\"post_tag\" data-id=\"72\">Apple<\/a><\/em><\/strong> and <strong>Columbia University<\/strong> have developed an <strong>AI-powered system<\/strong> that allows visually impaired individuals to virtually explore streets.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Introducing SceneScout: Virtual Street Exploration<\/h2>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/79207c1bcc388a2d3eb5afdf39e9efd11c9ea095-1024x576.jpeg\" alt=\"\" class=\"wp-image-24169\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/79207c1bcc388a2d3eb5afdf39e9efd11c9ea095-1024x576.jpeg 1024w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/79207c1bcc388a2d3eb5afdf39e9efd11c9ea095-300x169.jpeg 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/79207c1bcc388a2d3eb5afdf39e9efd11c9ea095-768x432.jpeg 
768w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/79207c1bcc388a2d3eb5afdf39e9efd11c9ea095-390x220.jpeg 390w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/79207c1bcc388a2d3eb5afdf39e9efd11c9ea095-150x84.jpeg 150w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/79207c1bcc388a2d3eb5afdf39e9efd11c9ea095-scaled.jpeg 1200w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>Apple and Columbia University have collaboratively developed a new <strong>AI-powered system<\/strong> to help <strong>visually impaired<\/strong> or <strong>low-vision individuals<\/strong> explore streets more safely and independently. This prototype, named <strong>SceneScout<\/strong>, is currently in the research phase and can <strong>describe street views using Apple Maps data<\/strong>.<\/p>\n\n\n\n<p>SceneScout aims to reduce the uncertainty faced by visually impaired or low-vision individuals when deciding to travel in an unfamiliar environment. 
While most traditional tools offer real-time directions, this new system allows users to <strong>gather more information about their surroundings before traveling<\/strong>.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">How SceneScout Works: Two Exploration Modes<\/h2>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/a2938f380f06a50192efba3f552eac209fd683ee-1024x576.jpeg\" alt=\"\" class=\"wp-image-24170\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/a2938f380f06a50192efba3f552eac209fd683ee-1024x576.jpeg 1024w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/a2938f380f06a50192efba3f552eac209fd683ee-300x169.jpeg 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/a2938f380f06a50192efba3f552eac209fd683ee-768x432.jpeg 768w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/a2938f380f06a50192efba3f552eac209fd683ee-390x220.jpeg 390w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/a2938f380f06a50192efba3f552eac209fd683ee-150x84.jpeg 150w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/07\/a2938f380f06a50192efba3f552eac209fd683ee-scaled.jpeg 1200w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>The system converts visual data into text descriptions. Users can virtually explore details they might encounter along a route or the general environment of a specific area. 
The system offers two exploration modes:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Route Preview Mode:<\/strong> Users can review information such as <strong>sidewalk conditions<\/strong>, <strong>intersection layouts<\/strong>, and <strong>bus stops<\/strong> before starting their journey.<\/li>\n\n\n\n<li><strong>Virtual Exploration Mode:<\/strong> Users describe the kind of environment they are interested in, and the system guides them by describing streets that match those interests.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">AI Integration and Initial Test Results<\/h2>\n\n\n\n<p>SceneScout is powered by a <strong>GPT-4o-based AI model<\/strong> that works with <strong>panoramic images obtained from Apple Maps<\/strong>. The system analyzes this visual data and translates it into <strong>short, medium, or long text descriptions<\/strong>. The web interface is designed to be compatible with <strong>screen reader technologies<\/strong>.<\/p>\n\n\n\n<p>Initial tests involved <strong>10 visually impaired individuals<\/strong> experienced in using screen readers. Participants found the <strong>Virtual Exploration mode particularly useful<\/strong>. However, the tests also highlighted shortcomings. Approximately <strong>72% of the generated descriptions were accurate<\/strong>, but some contained incorrect information. 
For instance, the system sometimes described a crossing without an auditory signal as having one, or misidentified street signs.<\/p>","protected":false},"excerpt":{"rendered":"<p>Technology is continuously expanding the possibilities for individuals with visual impairments. However, navigating daily life independently remains a significant challenge for them. Complex urban environments and insufficient infrastructure particularly complicate wayfinding. Addressing these needs, Apple and Columbia University have developed an AI-powered system that allows visually impaired individuals to virtually explore streets. Introducing SceneScout: Virtual &hellip;<\/p>\n","protected":false},"author":1,"featured_media":12578,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAown96uCw:productID":"","footnotes":""},"categories":[332],"tags":[335,210,72],"class_list":["post-24168","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-information","tag-ai-news","tag-ai-tools-news","tag-apple-news-and-content"],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/24168","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/comments?post=24168"}],"version-history":[{"count":0,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/24168\/revisio
ns"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media\/12578"}],"wp:attachment":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media?parent=24168"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/categories?post=24168"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/tags?post=24168"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}