It took 5 months to get here, but I finally got my Dwarf 3 telescope and had a chance to take some images. Now, the images right from the telescope don’t look this good, but you can download the .fits files and process them yourself. So I figured I’d try to learn how to do that. I used this tutorial as a starting point (I skipped re-stacking the images myself) and got some nice results. These were taken from my backyard in light-polluted Dallas, TX.
I think this is the best solution at the moment. I’ve added a button (labeled “spatial”) over these that triggers fullscreen. If you’re using an Apple Vision Pro, putting these in fullscreen will allow you to see the 3D versions. Everyone else will get 2D versions.
I love having (what I think is) a cool website. I also want a blog. You can do that with WordPress, but it’s a lot of work. It also seems that problems are likely to crop up at some point. They certainly did for me. I am absolutely not interested in trying to fix that stuff. So now I’ve got this new installation of WordPress on its own domain. I’m using the default theme with a tiny bit of additional CSS. Hopefully this will be easier to maintain.
I spent a little time over the weekend trying to bring The Phantom Moon to life. I created this in Blender, rendered out a still image, and then made a video out of it. This was recorded in my Apple Vision Pro.
In visionOS 2.2 (I’m on the developer beta), you can now “tap to view spatial photos and videos embedded on web pages” in Safari.
So if you’re viewing this in Safari in visionOS 2.2, you can select and hold the image, which brings up a menu where you can select “View Spatial Photo.” To view the spatial video, tap it to start it playing, then tap the fullscreen button in the upper-left corner.
A couple of weeks ago Anthony Maës released an open source immersive video player based on Mike Swanson’s work. It’s also published as an app that you can download on the Vision Pro. It’s now my go-to app for playing immersive MV-HEVC videos. I was hopeful I could use this to make an app for my work.
For my purposes, I want an app that you can download and then just hit play to watch the thing I’ve made. Today that would be as close as I can get to the experience of watching Apple’s immersive videos in the TV app. So this weekend I set out to see if I could modify this player with a custom menu that only plays a video that I include with the app. Spoiler alert: I did it! Here’s roughly how:
I forked the OpenImmersive GitHub repository.
I opened the forked project in Xcode.
I spent some time trying to make things work by hard-coding a stream URL into the file-picking section of the app. That worked, but streaming a 115 Mbit/sec video file from my shared hosting account sucks even with fiber internet. Not to mention this could get expensive.
So I copied the code for picking a file into ChatGPT and asked it to modify it to play a video file bundled with the app. It did it, but it used AVKit, which is Apple’s video-playback framework. The issue with immersive video is that Apple doesn’t provide a framework for it, but this open source project is built on one (OpenImmersiveLib). So fixing this was a one-line change: I swapped out “import AVKit” for “import OpenImmersive” and it worked like a charm (there’s a rough sketch of the idea after these steps).
I deleted the menu parts I didn’t need.
I created a new app icon.
I learned how to exclude my giant video file from the GitHub repository.
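To make the idea concrete, here’s a minimal sketch of what the bundled-playback change could look like. The file name and the player view below are placeholders, not the exact OpenImmersive API (check the package for the real view and initializer); the part I’m sure about is resolving the bundled file with Bundle.main.url instead of streaming from a remote URL.

```swift
import SwiftUI
import OpenImmersive // swapped in for AVKit -- the one-line change described above

// Minimal sketch: play a video that ships inside the app bundle instead of
// asking the user to pick a file or streaming from a hard-coded URL.
struct BundledVideoView: View {
    // "MyVideo.mov" is a placeholder -- use whatever file is bundled with the app.
    private var bundledVideoURL: URL? {
        Bundle.main.url(forResource: "MyVideo", withExtension: "mov")
    }

    var body: some View {
        if let url = bundledVideoURL {
            // Hypothetical player view -- substitute whatever view (and stream
            // model, if any) the OpenImmersive package actually exposes.
            ImmersivePlayerView(videoURL: url)
        } else {
            Text("Bundled video not found")
        }
    }
}
```

Bundling the video this way is also why that last step matters: a multi-gigabyte .mov has no business in Git history, so it stays out of the repository (typically just an entry in .gitignore) and only ships inside the built app.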
The camera app in iOS 18 has a “Spatial” mode on the iPhone 16 and 16 Pro, where you can switch between photos and video.
I’ve always wanted to make a B&W anaglyph photo zine, and while running this morning I realized that I knew how to make this work for images taken with my new iPhone.
Select your images in the Photos desktop app and use command+option+e to export them all at once.
I’ve seen a few VR180 creators on Reddit talk about turning their 8K footage from the Canon R5C into 16K. The examples I’ve seen do look better, so I thought I’d try it out. I used Topaz Labs Video AI to upscale a 30-second clip that I shot while we were on vacation in Guerneville, CA a couple of weeks ago.
It took about 26 hours (!) to upscale the video on my Mac Studio (M1 Max processor). I just used the default settings. I imagine I can get better results in the future. Here are crops from the original and upscaled video.