<h1>Stable Diffusion With Core ML on Apple Silicon</h1>
<p><em>December 2, 2022</em></p>
<p><a href="https://machinelearning.apple.com/research/stable-diffusion-coreml-apple-silicon">Apple</a> (<a href="https://news.ycombinator.com/item?id=33822157">Hacker News</a>):</p>
<blockquote cite="https://machinelearning.apple.com/research/stable-diffusion-coreml-apple-silicon">
<p>Today, we are excited to release optimizations to Core ML for <a href="https://stability.ai/blog/stable-diffusion-announcement">Stable Diffusion</a> in macOS 13.1 and iOS 16.2, along with code to get started with deploying to Apple Silicon devices.</p>
<p>[&#8230;]</p>
<p>Beyond image generation from text prompts, developers are also discovering other creative uses for Stable Diffusion, such as image editing, in-painting, out-painting, super-resolution, style transfer, and even color palette generation.</p>
<p>[&#8230;]</p>
<p>To learn more about how we optimized a model of this size and complexity to run on the Apple Neural Engine, check out our previous article on <a href="https://machinelearning.apple.com/research/neural-engine-transformers">Deploying Transformers on the Apple Neural Engine</a>. The optimization principles outlined there generalize to Stable Diffusion, even though it is 19x larger than the model studied in that article.</p>
<p>Optimizing Core ML for Stable Diffusion and simplifying model conversion makes it easier for developers to incorporate this technology in their apps in a privacy-preserving and economically feasible way, while getting the best performance on Apple Silicon.</p>
</blockquote>
<p><a href="https://github.com/apple/ml-stable-diffusion">Core ML Stable Diffusion</a>:</p>
<blockquote cite="https://github.com/apple/ml-stable-diffusion">
<p>This repository comprises:</p>
<ul>
<li><code>python_coreml_stable_diffusion</code>, a Python package for converting PyTorch models to Core ML format and performing image generation with Hugging Face <a href="https://github.com/huggingface/diffusers">diffusers</a> in Python</li>
<li><code>StableDiffusion</code>, a Swift package that developers can add to their Xcode projects as a dependency to deploy image generation capabilities in their apps. The Swift package relies on the Core ML model files generated by <code>python_coreml_stable_diffusion</code>.</li>
</ul>
</blockquote>
<p>An M2 MacBook Air is significantly faster than an M1 Pro MacBook Pro.</p>
<p>Previously:</p>
<ul>
<li><a href="https://mjtsai.com/blog/2022/09/23/midjourney-and-stable-diffusion/">Midjourney and Stable Diffusion</a></li>
</ul>
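<p>The two halves of the repository fit together in a simple pipeline: the Python package converts the PyTorch weights into Core ML resources on disk, and the Swift package loads those resources to generate images in an app. A minimal sketch of the Swift side, based on the package's published API at release (the exact initializer and parameter names may have changed in later versions, and <code>resourceURL</code> is a placeholder for wherever you put the converted models):</p>

```swift
import CoreML
import StableDiffusion  // the Swift package from apple/ml-stable-diffusion

// Placeholder path: a directory containing the Core ML resources
// produced by the python_coreml_stable_diffusion conversion step.
let resourceURL = URL(fileURLWithPath: "/path/to/converted/models")

// Prefer the Apple Neural Engine where available; this is the knob
// the optimization article above is about.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine

// Load the converted models and run text-to-image generation.
let pipeline = try StableDiffusionPipeline(resourcesAt: resourceURL,
                                           configuration: config)
let images = try pipeline.generateImages(prompt: "an astronaut riding a horse",
                                         imageCount: 1,
                                         seed: 93)
// images is an array of optional CGImages, one per requested image
// (nil where the safety checker rejected an output).
```

<p>The <code>computeUnits</code> setting is worth experimenting with per device: the repo's documentation notes that the best choice (Neural Engine vs. GPU) varies across Apple Silicon generations and memory configurations.</p>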