My goal is to convert a SwiftUI view to an image. I am currently building for iOS 15.
Here is the code that converts the view to an image:
extension View {
    func snapshot() -> UIImage {
        let controller = UIHostingController(rootView: self)
        let view = controller.view
        let targetSize = controller.view.intrinsicContentSize

        view?.bounds = CGRect(origin: .zero, size: targetSize)
        view?.backgroundColor = .clear

        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            view?.drawHierarchy(in: controller.view.bounds, afterScreenUpdates: true)
        }
    }
}
Expected result: https://share.cleanshot.com/5Pvzb7
Actual result: https://share.cleanshot.com/O4GKUF
Below is the body of the view, which has a button and an image view:
var body: some View {
    VStack {
        imageView
        Spacer(minLength: 200)
        Button {
            UIImageWriteToSavedPhotosAlbum(imageView.snapshot(), nil, nil, nil)
        } label: {
            Label("Save to Photos", systemImage: "photo")
        }
    }
}
The imageView is hardcoded with data, and the image is loaded from the internet:
var imageView: some View {
    VStack {
        ZStack(alignment: .top) {
            Color.red
            VStack {
                VStack {
                    HStack(alignment: .top) {
                        Rectangle()
                            .frame(width: 56, height: 56)
                            .cornerRadius(28)
                        VStack {
                            HStack {
                                Text("Yura Filin")
                                    .font(.system(size: 16))
                                Spacer()
                            }
                            .padding(.bottom, 1)
                            HStack {
                                Text("qweqwewqeqqweqeasd asd asd aosidhsa doaid adoaid adiad hiu uh i hiu ih uhuih iuhiu ih asdi ")
                                    .lineLimit(90)
                                    .multilineTextAlignment(.leading)
                                    .font(.system(size: 16))
                                Spacer()
                            }
                            .padding(.bottom, 2)
                            HStack {
                                Image(uiImage: image)
                                    .resizable()
                                    .aspectRatio(contentMode: .fit)
                                    .frame(width: (286 - 64 - 12))
                                    .onReceive(imageLoader.didChange) { data in
                                        self.image = UIImage(data: data) ?? UIImage()
                                    }
                                    .cornerRadius(14)
                                Spacer()
                            }
                            Spacer()
                        }
                        .padding(.leading, 12)
                        .readSize { size in
                            print(size)
                        }
                    }
                }
                .padding(16)
                .background(.green)
                .cornerRadius(12)
            }
            .padding(64)
        }
    }
    .ignoresSafeArea()
}
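For context, imageLoader.didChange publishes the downloaded image data, and readSize reports the rendered size of a view. Simplified sketches of helpers along those lines:

import SwiftUI
import Combine

class ImageLoader: ObservableObject {
    // Emits the raw image data once the download finishes.
    let didChange = PassthroughSubject<Data, Never>()

    init(urlString: String) {
        guard let url = URL(string: urlString) else { return }
        URLSession.shared.dataTask(with: url) { data, _, _ in
            guard let data = data else { return }
            DispatchQueue.main.async {
                self.didChange.send(data)
            }
        }.resume()
    }
}

private struct SizePreferenceKey: PreferenceKey {
    static var defaultValue: CGSize = .zero
    static func reduce(value: inout CGSize, nextValue: () -> CGSize) {}
}

extension View {
    // Reports the view's size through a preference key whenever it changes.
    func readSize(onChange: @escaping (CGSize) -> Void) -> some View {
        background(
            GeometryReader { proxy in
                Color.clear.preference(key: SizePreferenceKey.self, value: proxy.size)
            }
        )
        .onPreferenceChange(SizePreferenceKey.self, perform: onChange)
    }
}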
Any help is appreciated. Thanks!
2 Answers
When a view is rendered normally, its size is constrained by the device; that doesn't happen here. So you need to apply a frame with a defined width (and possibly height) to the view you're capturing.
When I needed to do this, I created another view that isn’t rendered to the user and used it solely for rendering the image. That way, any sizing you have to define in the frame doesn’t mess up what the user sees.
It might take you a while to get it right, but try adding a frame modifier with the desired width to the view you're rendering. 😊
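For example, something along these lines as a starting point (the 300 x 450 frame is just a placeholder; use whatever dimensions match your design):

Button {
    // Give the captured view an explicit size before snapshotting,
    // since there is no device layout pass to size it for you.
    let renderView = imageView
        .frame(width: 300, height: 450)
    UIImageWriteToSavedPhotosAlbum(renderView.snapshot(), nil, nil, nil)
} label: {
    Label("Save to Photos", systemImage: "photo")
}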
In order to call UIImageWriteToSavedPhotosAlbum(), you must first add the NSPhotoLibraryAddUsageDescription key to your Info.plist. The code below solves the problem.
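This is a sketch of the approach, rendering the view at an explicit size (the 300 x 450 used at the call site is a placeholder; adjust it to your design):

extension View {
    // Renders the view into a UIImage at an explicit size, wrapping it in a
    // frame so the hosting controller has something concrete to lay out.
    func snapshot(size: CGSize) -> UIImage {
        let controller = UIHostingController(rootView: self.frame(width: size.width, height: size.height))
        let view = controller.view

        view?.bounds = CGRect(origin: .zero, size: size)
        view?.backgroundColor = .clear

        let renderer = UIGraphicsImageRenderer(size: size)
        return renderer.image { _ in
            view?.drawHierarchy(in: CGRect(origin: .zero, size: size), afterScreenUpdates: true)
        }
    }
}

// Usage inside the button action (remember the NSPhotoLibraryAddUsageDescription
// key in Info.plist, or the save call will crash):
Button {
    UIImageWriteToSavedPhotosAlbum(imageView.snapshot(size: CGSize(width: 300, height: 450)), nil, nil, nil)
} label: {
    Label("Save to Photos", systemImage: "photo")
}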
Output: