{"id":17703,"date":"2024-07-19T09:27:51","date_gmt":"2024-07-19T09:27:51","guid":{"rendered":"https:\/\/metaverseplanet.net\/blog\/?p=17703"},"modified":"2026-01-23T10:45:00","modified_gmt":"2026-01-23T10:45:00","slug":"openai-introduces-gpt-4o-mini-a-smaller-ai-model","status":"publish","type":"post","link":"https:\/\/metaverseplanet.net\/blog\/openai-introduces-gpt-4o-mini-a-smaller-ai-model\/","title":{"rendered":"OpenAI Introduces GPT-4o mini: A Smaller AI Model"},"content":{"rendered":"\n<p>A mini version of GPT-4o, the advanced multimodal AI model OpenAI introduced about two months ago, has arrived.<\/p>\n\n\n\n<p>About two months ago, OpenAI unveiled a new multimodal AI model called GPT-4o. The model stood out because it could work across text, image, audio, and video without needing to connect to other models. OpenAI has now announced GPT-4o mini. According to third-party evaluations, GPT-4o, the company&#8217;s most advanced model to date, sat at the top of the market. Shortly after, Anthropic&#8217;s Claude 3.5 Sonnet was released, and it has been competing closely with GPT-4o. 
<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">GPT-4o mini has arrived<\/h2>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/07\/OpenAI-Introduces-GPT-4o-mini-A-Smaller-AI-Model-1024x576.jpeg\" alt=\"\" class=\"wp-image-17704\" srcset=\"https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/07\/OpenAI-Introduces-GPT-4o-mini-A-Smaller-AI-Model-1024x576.jpeg 1024w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/07\/OpenAI-Introduces-GPT-4o-mini-A-Smaller-AI-Model-300x169.jpeg 300w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/07\/OpenAI-Introduces-GPT-4o-mini-A-Smaller-AI-Model-768x432.jpeg 768w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/07\/OpenAI-Introduces-GPT-4o-mini-A-Smaller-AI-Model-390x220.jpeg 390w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/07\/OpenAI-Introduces-GPT-4o-mini-A-Smaller-AI-Model-150x84.jpeg 150w, https:\/\/metaverseplanet.net\/blog\/wp-content\/uploads\/2024\/07\/OpenAI-Introduces-GPT-4o-mini-A-Smaller-AI-Model-scaled.jpeg 1200w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>OpenAI, aiming to stand out from the competition, described GPT-4o mini as &#8220;the most cost-effective small model on the market.&#8221; The model costs developers 15 cents per 1 million input tokens and 60 cents per 1 million output tokens. Olivier Godement, Head of API Product at OpenAI, said that GPT-4o mini is especially useful for businesses and start-ups. &#8220;The cost per intelligence is so favorable that I believe it will be used for customer support, software engineering, creative writing, and all sorts of tasks,&#8221; Godement said. 
&#8220;Every time we adopt a new model, new use cases emerge, and I think that&#8217;s going to be even more true for GPT-4o mini.&#8221;<\/p>","protected":false},"excerpt":{"rendered":"<p>A mini version of GPT-4o, the most advanced multimodal AI model introduced by OpenAI a few months ago, has arrived. About two months ago, OpenAI unveiled a new multimodal AI model called GPT-4o. This model stood out because it could work across text, image, audio, and video without needing to connect to other models. OpenAI &hellip;<\/p>\n","protected":false},"author":1,"featured_media":17359,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAown96uCw:productID":"","footnotes":""},"categories":[332],"tags":[335,65],"class_list":["post-17703","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-information","tag-ai-news","tag-chatgpt-news-and-content"],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/17703","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/comments?post=17703"}],"version-history":[{"count":1,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/17703\/revisions"}],"predecessor-version":[{"id":40650,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/posts\/17703\/revisions\/40650"}],"wp:featuredmedia":[{"embeddable":true,"href":"https
:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media\/17359"}],"wp:attachment":[{"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/media?parent=17703"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/categories?post=17703"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/metaverseplanet.net\/blog\/wp-json\/wp\/v2\/tags?post=17703"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}