{"id":36687,"date":"2025-12-16T08:02:16","date_gmt":"2025-12-16T08:02:16","guid":{"rendered":"https:\/\/metaverseplanet.net\/blog\/?p=36687"},"modified":"2026-01-05T08:00:04","modified_gmt":"2026-01-05T08:00:04","slug":"ai-diaries-this-week-in-the-world-of-ai-december-16-2025","status":"publish","type":"post","link":"https:\/\/metaverseplanet.net\/blog\/ai-diaries-this-week-in-the-world-of-ai-december-16-2025\/","title":{"rendered":"AI Diaries: This Week in the World of AI (December 16, 2025)"},"content":{"rendered":"\n<p>What happened this week in the <strong>AI world<\/strong>; What new things were introduced? In our &#8220;<strong><em><a href=\"https:\/\/metaverseplanet.net\/blog\/tag\/ai-diaries\/\" data-type=\"post_tag\" data-id=\"285\">AI Diaries<\/a><\/em><\/strong>&#8221; series, we continue to record weekly what is happening in this rapidly growing field.<\/p>\n\n\n\n<p><strong><em><a href=\"https:\/\/metaverseplanet.net\/blog\/ai-blog\/\" data-type=\"category\" data-id=\"48\">Artificial intelligence technologies<\/a><\/em><\/strong>, which have begun to be used much more widely in daily life recently, continue to develop rapidly. While a remarkable development occurs in this field almost every day, significant thresholds are being crossed one by one. In our &#8220;<strong>AI Diaries<\/strong>&#8221; series, we continue to regularly record the development of this <strong>technological revolution<\/strong>, which has the potential to significantly change life on earth in the coming period.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>What Happened This Week in the AI World?<\/strong><\/h2>\n\n\n\n<p> The most talked-about company in the AI world this week was <strong>OpenAI<\/strong>. While releasing <strong>GPT-5.2<\/strong>, its most advanced model to date, the company also continued to take steps to make <strong>ChatGPT<\/strong> more popular. 
On the other hand, the <strong>effects of AI on humans<\/strong> came up for debate once again this week. In parallel, as every week, new <strong><em><a href=\"https:\/\/metaverseplanet.net\/blog\/artificial-intelligence-tools\/\" data-type=\"category\" data-id=\"50\">AI tools<\/a><\/em><\/strong> entered our lives. Here are this week&#8217;s prominent developments in the AI world:<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>OpenAI Released GPT-5.2<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/06\/The-owner-of-ChatGPT-donates-half-of-his-fortune-2.jpg\" alt=\"\" class=\"wp-image-17976\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/06\/The-owner-of-ChatGPT-donates-half-of-his-fortune-2.jpg 1024w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/06\/The-owner-of-ChatGPT-donates-half-of-his-fortune-2-300x169.jpg 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/06\/The-owner-of-ChatGPT-donates-half-of-his-fortune-2-768x432.jpg 768w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/06\/The-owner-of-ChatGPT-donates-half-of-his-fortune-2-390x220.jpg 390w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/06\/The-owner-of-ChatGPT-donates-half-of-his-fortune-2-150x84.jpg 150w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p><strong>OpenAI<\/strong> announced <strong><em><a href=\"https:\/\/metaverseplanet.net\/blog\/gpt-5-2-unveiled-faster-more-accurate-more-powerful\/\" data-type=\"post\" data-id=\"36322\">GPT-5.2<\/a><\/em><\/strong>, its most advanced <strong>artificial intelligence model<\/strong> to date. 
The company says the new model is its best product yet for daily professional use. According to OpenAI\u2019s statement, <strong>GPT-5.2<\/strong> performs better than previous models in <strong>spreadsheet creation<\/strong>, <strong>presentation preparation<\/strong>, <strong>visual perception<\/strong>, <strong>coding<\/strong>, and <strong>understanding long contexts<\/strong>.<\/p>\n\n\n\n<p><strong>GPT-5.2<\/strong> is now offered in <strong>ChatGPT<\/strong>&#8216;s paid plans (Plus, Pro, Go, Business, and Enterprise). However, the company says the rollout will proceed gradually to ensure the user experience remains smooth and stable.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>The Proliferation of AI Changes Human Behavior<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"940\" height=\"529\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/A-Robotic-Eye-That-Sees-Better-Than-Humans-Has-Been-Developed.webp\" alt=\"\" class=\"wp-image-33886\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/A-Robotic-Eye-That-Sees-Better-Than-Humans-Has-Been-Developed.webp 940w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/A-Robotic-Eye-That-Sees-Better-Than-Humans-Has-Been-Developed-300x169.webp 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/A-Robotic-Eye-That-Sees-Better-Than-Humans-Has-Been-Developed-768x432.webp 768w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/A-Robotic-Eye-That-Sees-Better-Than-Humans-Has-Been-Developed-390x220.webp 390w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/A-Robotic-Eye-That-Sees-Better-Than-Humans-Has-Been-Developed-150x84.webp 150w\" sizes=\"(max-width: 940px) 100vw, 940px\" \/><\/figure>\n\n\n\n<p> Research published this week by the 
<strong>Center for Adaptive Rationality<\/strong> at the <strong>Max Planck Institute for Human Development<\/strong> revealed that the way people speak and write has begun to resemble the language produced by <strong>large language models<\/strong>. The research points to a noticeable increase in the use of certain words such as &#8220;<strong>bolster<\/strong>,&#8221; &#8220;<strong>comprehend<\/strong>,&#8221; &#8220;<strong>meticulous<\/strong>,&#8221; and &#8220;<strong>swift<\/strong>&#8221; in the speech of YouTube creators in the 18 months following the launch of <strong>ChatGPT<\/strong>. Moreover, this is not merely a pattern surfaced by data analysis; the change has reportedly become palpable in internet communities, the rhetoric of public figures, and even ordinary daily life. The study offers quantitative evidence that as individuals come into contact with <strong>large language models<\/strong>, their vocabulary choices shift; in other words, they become &#8220;<strong>ChatGPT-ized<\/strong>.&#8221;<\/p>\n\n\n\n<p>An article published in the <strong>New York Times<\/strong> this week revealed that as video-generating <strong>AI tools<\/strong> like <strong>Sora<\/strong> and <strong>Veo 3<\/strong> develop, the <strong>disinformation crisis<\/strong> on social media grows even larger. People are struggling to distinguish real videos from <strong>AI productions<\/strong>, which also shakes trust in genuine footage. 
The article emphasizes that AI and internet companies are not taking sufficient measures to prevent this <strong>disinformation<\/strong>.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Adobe Applications Integrated into ChatGPT<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe title=\"Adobe x ChatGPT | Adobe\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/uouNjFuJ3QU?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p> <strong><em><a href=\"https:\/\/metaverseplanet.net\/blog\/tag\/adobe\/\" data-type=\"post_tag\" data-id=\"199\">Adobe<\/a><\/em><\/strong> has made <strong>Photoshop<\/strong>, <strong>Acrobat<\/strong>, and <strong>Adobe Express<\/strong> available within <strong><em><a href=\"https:\/\/metaverseplanet.net\/blog\/tag\/chatgpt-news-and-content\/\" data-type=\"post_tag\" data-id=\"65\">ChatGPT<\/a><\/em><\/strong>. Now, ChatGPT users will be able to <strong>edit images<\/strong>, <strong>personalize designs<\/strong>, and <strong>manage documents<\/strong> via the chatbot interface. Adobe states that it wants to make its tools more accessible with this integration. After connecting the applications to ChatGPT from the Settings &#8211; Apps and Connections section, you can find the apps by clicking the plus sign next to the chat area and selecting &#8220;More.&#8221; To try the new features, simply type the name of the Adobe app followed by the action you want. 
For example, to edit the background of a picture, simply enter the command &#8220;Adobe Photoshop, help me blur the background of this picture.&#8221;<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Copyright Fight Turns into Licensing Agreements: OpenAI Strikes Deal with Disney<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"683\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/Star-wars-1-1024x683.png\" alt=\"\" class=\"wp-image-33256\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/Star-wars-1-1024x683.png 1024w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/Star-wars-1-300x200.png 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/Star-wars-1-768x512.png 768w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/Star-wars-1-150x100.png 150w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/11\/Star-wars-1.png 1200w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p> The unauthorized use of <strong>copyrighted works<\/strong> and characters by <strong>AI companies<\/strong> has turned into a legal struggle with creators. However, events this week show that this <strong>copyright fight<\/strong> might give way to <strong>licensing agreements<\/strong>. Indeed, this is exactly what happened between <strong>OpenAI<\/strong> and <strong>Disney<\/strong>. Disney, which had previously pursued OpenAI over copyright infringement, reached an agreement with the company this week, allowing <strong>Disney characters<\/strong> to be used by <strong>ChatGPT<\/strong> and <strong>Sora<\/strong>. 
In return, Disney invested <strong>$1 billion<\/strong> in OpenAI, becoming one of the investors in the rapidly growing company.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>World&#8217;s Smallest &#8220;Supercomputer&#8221; Unveiled<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"720\" height=\"480\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/tiny-computer.avif\" alt=\"\" class=\"wp-image-36689\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/tiny-computer.avif 720w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/tiny-computer-300x200.avif 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/tiny-computer-150x100.avif 150w\" sizes=\"(max-width: 720px) 100vw, 720px\" \/><\/figure>\n\n\n\n<p> <strong>Tiiny AI<\/strong> introduced the <strong>Tiiny AI Pocket Lab<\/strong>, which it calls the world&#8217;s smallest <strong>AI supercomputer<\/strong>. Measuring only 14.2 \u00d7 8 \u00d7 2.53 cm and weighing approximately 300 grams, the device is said to run <strong>120-billion-parameter LLMs<\/strong> completely <strong>locally<\/strong> despite its tiny body. Supported models include widely used families such as <strong>GPT-OSS<\/strong>, <strong>Llama<\/strong>, <strong>Qwen<\/strong>, <strong>DeepSeek<\/strong>, <strong>Mistral<\/strong>, and <strong>Phi<\/strong>.<\/p>\n\n\n\n<p>The 12-core processor with <strong>ARMv9.2 architecture<\/strong> at the heart of the device is supported by a special heterogeneous structure (<strong>SoC + discrete dNPU<\/strong>) designed to accelerate AI workloads. This structure offers a total computing power of <strong>190 TOPS<\/strong>. 
Also critical to the device&#8217;s performance are its <strong>80 GB of LPDDR5X memory<\/strong> and <strong>1 TB of SSD storage<\/strong>. Combined with aggressive <strong>quantization techniques<\/strong>, this capacity makes it possible to run massive models smoothly.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>First Earbuds Capable of Recording with AI Help Unveiled: TicNote Pods<\/strong> <\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"720\" height=\"405\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/Worlds-First-AI-Powered-Recording-Earbuds-Unveiled-2.avif\" alt=\"\" class=\"wp-image-36470\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/Worlds-First-AI-Powered-Recording-Earbuds-Unveiled-2.avif 720w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/Worlds-First-AI-Powered-Recording-Earbuds-Unveiled-2-300x169.avif 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/Worlds-First-AI-Powered-Recording-Earbuds-Unveiled-2-390x220.avif 390w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/Worlds-First-AI-Powered-Recording-Earbuds-Unveiled-2-150x84.avif 150w\" sizes=\"(max-width: 720px) 100vw, 720px\" \/><\/figure>\n\n\n\n<p><strong><em><a href=\"https:\/\/metaverseplanet.net\/blog\/worlds-first-ai-powered-recording-earbuds-unveiled\/\" data-type=\"post\" data-id=\"36468\">TicNote Pods<\/a><\/em><\/strong> holds the title of the world&#8217;s first <strong>4G AI-supported note-taking earbuds<\/strong>. Designed by <strong>Mobvoi<\/strong>, these earbuds can do much more than play music or make phone calls. 
With the built-in <strong>4G eSIM<\/strong> and customized <strong>Shadow AI<\/strong> system, TicNote Pods can record, synchronize, and <strong>transcribe<\/strong> conversations completely on their own.<\/p>\n\n\n\n<p><strong>TicNote Pods<\/strong> are designed for use in both personal and professional environments. When worn, they can clearly record in-person conversations, online meetings, Zoom calls, phone calls, interviews, or lectures at school. The earbuds&#8217; charging case can also be used as a standalone <strong>recording device<\/strong>: you can record surrounding conversations simply by placing it on a table.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Amazon Alexa+ Will Automatically Buy Price-Dropped Products<\/strong> <\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"720\" height=\"405\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/alexa1.avif\" alt=\"\" class=\"wp-image-36690\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/alexa1.avif 720w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/alexa1-300x169.avif 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/alexa1-390x220.avif 390w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2025\/12\/alexa1-150x84.avif 150w\" sizes=\"(max-width: 720px) 100vw, 720px\" \/><\/figure>\n\n\n\n<p><strong>Amazon<\/strong> is turning its <strong>AI-powered assistant<\/strong> into a more aggressive shopping engine by activating new <strong>shopping-focused capabilities<\/strong> for <strong>Alexa+<\/strong>. The updates announced by the company directly target both deal hunters and frequent shoppers. 
The most notable innovation is <strong>Alexa+<\/strong>&#8216;s ability to <strong>automatically purchase<\/strong> a specific product the moment it drops to a price set by the user. The user simply states the desired product and the amount they are willing to pay; when the price drops to that level, the assistant automatically places the order using the default payment method and delivery address.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>An Alternative AI Browser to Chrome from Google: Disco<\/strong> <\/h2>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe title=\"Turn tabs into a custom app with GenTabs in Disco, a new Google Labs experiment\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/9CKeTgcMjzc?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><strong>Google<\/strong>&#8216;s Chrome team recently developed a new experimental browser. When it receives a question or command, it automatically opens many related tabs and then creates a <strong>mini-application<\/strong> specific to the task the user wants to do. For example, if someone asks for travel suggestions, it creates a trip planner; if they ask for study help, it creates a flashcard system. In short, it offers an experience somewhere between searching on Google and <strong>vibe coding<\/strong>. This concept is called <strong>GenTabs<\/strong>, and the browser itself is called <strong>Disco<\/strong>. 
By launching them experimentally under <strong>Google Labs<\/strong>, Google is testing whether they have a place in the future of the web.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>AI Tools Released This Week<\/strong><\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>AutoGLM<\/strong>, developed by Chinese AI company <strong>Z.ai<\/strong>, is one of the most remarkable examples of an <strong>AI agent<\/strong> we have seen so far. AutoGLM can perform many tasks <strong>autonomously<\/strong> by using the phone screen almost like a human; at least, that is what the demos shared so far suggest.<\/li>\n\n\n\n<li>The AI tool named <strong>Wan-Move<\/strong>, published by <strong>Alibaba<\/strong>, lets users control the direction in which characters and objects in a video move; the user simply draws the desired path of movement.<\/li>\n\n\n\n<li><strong>OneStory<\/strong>, published by <strong>Meta<\/strong>, can produce multiple <strong>visually consistent<\/strong> clips from a single written command or image.<\/li>\n\n\n\n<li><strong>RealGen<\/strong>, introduced this week, stands out as one of the best open models on the market for producing <strong>photorealistic images<\/strong>.<\/li>\n\n\n\n<li><strong>EgoEdit<\/strong>, developed by the company behind Snapchat (<strong>Snap<\/strong>), can edit videos in <strong>real-time<\/strong> and apply various manipulations to them. 
This tool is thought to be particularly useful for content creators and <strong>VR content<\/strong> production.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Shorts from the AI World<\/strong><\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>OpenAI<\/strong> announced that <strong>ChatGPT<\/strong>&#8216;s <strong>adult mode<\/strong> will be released in early <strong>2026<\/strong>.<\/li>\n\n\n\n<li>The <strong>US<\/strong> approved the conditional export of <strong>Nvidia<\/strong>&#8216;s <strong>H200 chips<\/strong> to <strong>China<\/strong>. It also indicated that companies like <strong>AMD<\/strong> and <strong>Intel<\/strong> would receive similar permission, and that the government would take a 25 percent share of the sales. Despite this development, China plans to use <strong>domestic chips<\/strong>.<\/li>\n\n\n\n<li><strong>Meta<\/strong> is developing a new and closed-source AI model codenamed &#8220;<strong>Avocado<\/strong>.&#8221; The model, expected in the first quarter of <strong>2026<\/strong>, signals that the company is moving away from its <strong>open-source<\/strong> approach.<\/li>\n\n\n\n<li>The <strong>European Union<\/strong> launched an investigation into <strong>Google<\/strong>, scrutinizing allegations that it used publishers&#8217; content and YouTube videos in its AI models without permission and in potential violation of copyright.<\/li>\n\n\n\n<li><strong>Spotify<\/strong> is testing a feature for Premium users that allows them to control the algorithm and the general listening experience on the platform. 
This feature, called &#8220;<strong>Prompted Playlist<\/strong>,&#8221; will make it easier to find exactly the music you are looking for with <strong>AI support<\/strong>.<\/li>\n\n\n\n<li>According to information reported by <strong>Axios<\/strong>, <strong>OpenAI management<\/strong> thinks that their most advanced models, which they have not yet released, could cause greater <strong>cybersecurity problems<\/strong>.<\/li>\n\n\n\n<li><strong>Unconventional<\/strong>, an AI company founded two months ago, reached a valuation of <strong>$4.5 billion<\/strong>. The company&#8217;s main focus is to develop <strong>innovative technologies<\/strong> and hardware that will reduce the increasing <strong>energy consumption<\/strong> and computing costs of AI models.<\/li>\n\n\n\n<li><strong>Intel<\/strong> is moving to a new strategy focused on developing <strong>local AI<\/strong> and customer-specific <strong>ASIC chips<\/strong>, instead of direct <strong>GPU competition<\/strong> with <strong>NVIDIA<\/strong> and <strong>AMD<\/strong> in AI.<\/li>\n\n\n\n<li>According to a new study, <strong>AI models<\/strong> with <strong>reasoning capabilities<\/strong> consume an average of <strong>100 times more energy<\/strong> than standard models without this function.<\/li>\n\n\n\n<li><strong>Google<\/strong> brought <strong>Gemini AI integration<\/strong> to <strong>Chrome<\/strong> on iPhone and iPad. 
Users will now be able to perform page summarization, content explanation, and topic-based queries directly through the browser.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">You Might Also Like:<\/h3>\n\n\n<ul class=\"wp-block-latest-posts__list wp-block-latest-posts\"><li><a class=\"wp-block-latest-posts__post-title\" href=\"https:\/\/metaverseplanet.net\/blog\/the-dark-side-of-nanotechnology\/\">The Dark Side of Nanotechnology: Could Microscopic Swarms Erase Billions?<\/a><\/li>\n<li><a class=\"wp-block-latest-posts__post-title\" href=\"https:\/\/metaverseplanet.net\/blog\/the-illusion-of-digital-immortality\/\">The Illusion of Digital Immortality: Are You Really Uploading Your Mind?<\/a><\/li>\n<li><a class=\"wp-block-latest-posts__post-title\" href=\"https:\/\/metaverseplanet.net\/blog\/artemis-2s-deep-space-eclipse\/\">The View That Changes Everything: Artemis 2\u2019s Deep Space Eclipse<\/a><\/li>\n<\/ul>","protected":false},"excerpt":{"rendered":"<p>What happened this week in the AI world, and what new things were introduced? In our &#8220;AI Diaries&#8221; series, we continue to record weekly what is happening in this rapidly growing field. Artificial intelligence technologies, which have begun to be used much more widely in daily life recently, continue to develop rapidly. 
While a remarkable development &hellip;<\/p>\n","protected":false},"author":1,"featured_media":34987,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAown96uCw:productID":"","footnotes":""},"categories":[332],"tags":[285,335],"class_list":["post-36687","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-information","tag-ai-diaries","tag-ai-news"],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/36687","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/comments?post=36687"}],"version-history":[{"count":0,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/36687\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media\/34987"}],"wp:attachment":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media?parent=36687"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/categories?post=36687"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/tags?post=36687"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}