Here is a nice overview of Photo Sphere if you haven't seen it yet.
A rendered Photo Sphere of a Cityscape. I found this scene on Blendswap by Dimmyxv.
- Exporting photo spheres to an image.
- Uploading a rendered photo sphere to Google+.
- Downloading photo sphere from Google+.
- Importing photo spheres in Blender.
- Future thoughts.
Exporting Photo Spheres
I only know of two ways to do this: baking a texture using reflection, and using the Equirectangular camera. The first can only be done with Blender's internal render engine; the second only with Cycles.
Scene Setup
To change the perspective: with the camera object selected, go to "Properties Panel -> Object Data -> Lens" and set the lens type to "Panoramic". The default panorama type is Fisheye; change it to Equirectangular.
Move the camera to a good location, like the center of the scene (tip: use Alt+G and Alt+R to quickly clear the location and rotation). The camera needs to be at eye level; I just chose an arbitrary height of 2 meters (1 Blender unit is 1 meter). To work in feet or meters, go to "Properties -> Scene -> Units". After adding a subject around your camera, you should have a scene that looks something like this:
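If you prefer to script the setup, the steps above can be sketched with Blender's Python API. This is a rough sketch assuming the 2.7-era Cycles property names (`cam.cycles.panorama_type`), which may differ in other Blender versions; run it from Blender's Python console or Text Editor:

```python
# Sketch: equirectangular camera setup via bpy (assumes 2.7-era Cycles API).
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

cam_obj = scene.camera
cam_obj.location = (0.0, 0.0, 2.0)           # eye level: 2 m (1 Blender unit = 1 m)
cam_obj.rotation_euler = (1.5708, 0.0, 0.0)  # face the horizon instead of the floor

cam = cam_obj.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# Equirectangular output covers 360 x 180 degrees, so render at 2:1.
scene.render.resolution_x = 4096
scene.render.resolution_y = 2048
scene.render.resolution_percentage = 100
```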
Render the Scene
Switching to Camera View (Numpad 0) and to Rendered viewport shading will now show what the final flat photo sphere image is going to look like:
Uploading Rendered Photo Spheres
For Google+ to accept a photo as a photo sphere, it needs some specific XMP metadata embedded in the file. If you don't know how to add this yourself, Google provides a free online converter (typically used for Google Earth/Maps/Street View). I found that PNG files won't work (the download comes back 0 bytes in size), but JPG files work fine. The converter will ask for a compass heading, horizontal FOV, and vertical FOV; make sure to set the vertical FOV to 180 and the horizontal FOV to 360.
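To make the XMP step concrete, here is a minimal, hypothetical sketch of what the converter does: it embeds an XMP packet with Google's GPano properties into a JPEG as an APP1 segment right after the SOI marker. Real tools also merge with any existing XMP, which this sketch skips:

```python
import struct

# APP1 payloads carrying XMP start with this namespace string.
XMP_HEADER = b"http://ns.adobe.com/xap/1.0/\x00"

# Minimal GPano packet for a full 360 x 180 equirectangular panorama.
GPANO_XMP = (
    '<x:xmpmeta xmlns:x="adobe:ns:meta/">'
    '<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">'
    '<rdf:Description xmlns:GPano="http://ns.google.com/photos/1.0/panorama/"'
    ' GPano:ProjectionType="equirectangular"'
    ' GPano:FullPanoWidthPixels="{w}" GPano:FullPanoHeightPixels="{h}"'
    ' GPano:CroppedAreaImageWidthPixels="{w}" GPano:CroppedAreaImageHeightPixels="{h}"'
    ' GPano:CroppedAreaLeftPixels="0" GPano:CroppedAreaTopPixels="0"/>'
    '</rdf:RDF></x:xmpmeta>'
)

def add_gpano_xmp(jpeg_bytes, width, height):
    """Insert a GPano XMP APP1 segment right after the JPEG SOI marker."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG"
    payload = XMP_HEADER + GPANO_XMP.format(w=width, h=height).encode("utf-8")
    # Segment: APP1 marker + big-endian length (includes the 2 length bytes).
    segment = b"\xff\xe1" + struct.pack(">H", len(payload) + 2) + payload
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]
```

The property names come from Google's Photo Sphere XMP documentation; in practice the online converter handles all of this for you.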
OK, now you have your image with the XMP data. Simply upload it to Google+ and it will automatically detect it as a photo sphere. Try it out: Cityscape and Basic Scene.
Downloading Photo Spheres From Google+
For Google+, just go to the photo you want to download - like this. There should be a download link at the bottom left: "Options -> Download Full Size".
Importing Photo Spheres in Blender
For Cycles, go to "Properties -> World" and change the surface to an "Environment Texture" (you might need to enable nodes). Open the image you want as the background. You can set the projection to either Equirectangular or Mirror Ball, depending on the type of photo you have.
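The same world setup can be done in a few lines of bpy. This is a sketch for the Cycles node API; the image path is a placeholder you would replace with your downloaded photo sphere:

```python
# Sketch: set a photo sphere as the Cycles world background via bpy.
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links

env = nodes.new('ShaderNodeTexEnvironment')
env.image = bpy.data.images.load('/path/to/photosphere.jpg')  # placeholder path
env.projection = 'EQUIRECTANGULAR'  # or 'MIRROR_BALL'

# Feed the texture into the default Background node.
links.new(env.outputs['Color'], nodes['Background'].inputs['Color'])
```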
Background set to "Environment Texture"
Future Thoughts
Equirectangular Video Player
Google+, Photosynth, and other sites typically handle only 2-D photos. It would be neat to make a cross-platform, easy-to-use "Equirectangular Video Player"; if it were me, I would probably write it in JavaScript and WebGL. Players exist today (e.g. Kolor Eyes and krpano), so rendering a series of equirectangular photos to video is definitely possible, but they aren't what they should be. Ultimately, it should be as easily accessible as YouTube or Vimeo, with a polished look and feel like Street View.
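The core of such a player is a small piece of math: every view direction maps to exactly one pixel of the equirectangular frame. A sketch of that mapping (the axis convention, +y up and -z forward, is an assumption here; a WebGL player would do the same thing per-fragment in a shader):

```python
import math

def direction_to_uv(x, y, z):
    """Map a 3-D view direction to (u, v) coordinates on an equirectangular
    texture: u wraps 360 degrees around, v spans pole to pole.
    Assumed axis convention: +y is up, -z is 'forward'."""
    length = math.sqrt(x * x + y * y + z * z)
    yaw = math.atan2(x, -z)                # longitude, -pi..pi
    pitch = math.asin(y / length)          # latitude, -pi/2..pi/2
    u = (yaw + math.pi) / (2.0 * math.pi)  # 0..1 around the sphere
    v = 0.5 - pitch / math.pi              # 0 at the top, 1 at the bottom
    return u, v
```

Looking straight ahead, (0, 0, -1), lands at the image center (0.5, 0.5), and straight up, (0, 1, 0), lands on the top edge.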
Emerging Technology
These kinds of 3-D images and videos are useful to me and others because emerging technology, like the Oculus Rift, keeps improving in performance. Watching an equirectangular video on a 2-D screen makes it hard to understand what is going on unless you can move the view around with your head.
Hi, interesting article. I had no idea about the XMP data thing, thanks for sharing.
"Blender won't let me set the focal length"
Can you elaborate on that? Blender doesn't let you change the focal length for panorama renders because it makes no sense at all ;)
I fail to see, however, how that relates to your need to set the ideal image size.
Also, I would recommend working with 2:1 images. That gives the most uniform distribution of pixels. I find the claim that G+ can't handle this aspect ratio properly strange, given that this is the de facto ratio for equirectangular panoramas.
cheers
"Blender won't let me set the focal length", that isn't wrong ;) But seriously, focal length is probably the wrong way to explain or correct for it. I was trying to describe the path that objects take around the camera and how it gets distorted. If you look at one of the objects in the scene, it doesn't take a circular path - it takes an elliptical one.
Looking at the difference between the x:1 and 1:1 images I uploaded, there seemed to be a lot more stitching issues in the horizon and objects with images that are not 1:1 - the pixel distribution wasn't distorted enough for me to tell the difference. I went back to take snapshots for you, but there really isn't that much difference in stitching. The horizon looks a tiny bit better in the 1:1, but that could be because of the number of samples in the render. I'll take this part out of the article because I think you're definitely right long term. Thanks.
Hi Brian! Thanks for this great tutorial, it is just what I was trying to do.
I have photo spheres taken on my Nexus, and I want to import them into Blender, sort of like being on the inside of a sphere, so you can recreate the sensation of being there. I am glad you did it first, because I am absolutely new to Blender.
However, following the tutorial, I realized that my menu is different, although we have the same version of Blender! Do you have any idea what I could do?
http://i.imgur.com/R9Y8UjY.png
Yeah, at the top of your screen you are probably using "Blender Render" as the rendering engine. To use this technique, switch to "Cycles". Here is a good intro:
http://www.youtube.com/watch?v=UTwXG3K4l2g
Blender's internal render engine uses the "baking a texture" approach mentioned in the post. I didn't cover it because I prefer Cycles, but the video above covers it in detail anyway.