
29 June 2011


How can you open a .ipm file that I made in Photofly in Autodesk Inventor Publisher Mobile for Android? The viewer doesn't have an open option.


If you place your files in online storage such as Dropbox, or email the file to yourself, you will get the option to open the file with Inventor Publisher Mobile.

You can share your 3D photo scene by sharing just one file. Here is an example: http://labs.blogs.com/its_alive_in_the_lab/2011/06/sharing-your-photo-scenes-with-just-one-file.html

For additional information on what gets placed into each export file type, see http://labs.blogs.com/its_alive_in_the_lab/2011/06/project-photofly-what-you-is-see-isnt-always-what-you-get.html.

Thanks for this piece of advice!

A few questions, though:
- "Using more than 50 photos is not recommended as there are some maximum data size limits on the server and your job may be rejected or fail"
"While there is no hard-coded upper limit on the number of photos"
Could you be more precise? Does it depend on whether many people are using Photofly at the same time?

- "Wide angle camera lens work the best due to the underlying technology."
I've been trying to scan a pediment with Photofly (Le Louvre, Paris, France: http://uppix.3dvf.com/?v=img9485.jpg ) without success. I was using a 100mm lens, which I guess might be an issue... Any advice on how to shoot this kind of distant object?
I had the feeling that taking many pictures (I shot more than 200) would help, but since there is a size limit on the server side, this won't be a good idea.
(Microsoft's Photosynth does get a nice point cloud from my set of photos: http://uppix.3dvf.com/?v=louvrepari.jpg )

- I've tried a few scans in some museums, with mixed results (failures, or nice models like this one : http://www.youtube.com/watch?v=tWC4ml4k4tE ).
I have the feeling that high ISO can also be an issue with Photofly.

- "Do not keep the camera still while rotating an object against a blank background. To Project Photofly the background is unlikely to look completely blank and it likely won’t work. Project Photofly is not a panoramic photo software. "
Would this work if the background is overexposed (you just have to use an off-camera flash directed at the background)?
I was hoping to scan small objects using a macro lens (Canon 100mm macro), which by the way would also help me have a blurred background...

Thanks again for those tips !

Hello Shaan,

I did try to capture stills from a movie, and I was quite pleased with the fact that I did not have to stitch any pictures. I think I had set it to capture one picture per second. Since it was not an HD camera, the resulting mesh was not too great. A better camera might give better results.

But it's worth exploring.


Thank you for sharing your experience in using images from video.



Thank you for all of the questions.

The number of photos is not dependent on the number of Photofly users; it is just a good recommendation, as too many or too few photos can result in bad or failed solutions.

A wide-angle lens works best, but you could use other lenses like the 100mm.

Lighting makes a big difference, as you noted about high ISO, since Project Photofly must be able to identify unique points. In addition, low or non-uniform lighting can result in a bad texture.

I would not suggest moving the object; instead, move your photo locations. Flash can cause problems with the texture mapping due to a greater probability of non-uniform lighting.

These are only tips we have found using this new technology, but experiment and you might find other tips or exceptions for the objects you are capturing, such as using a macro lens to turn the bokeh background to your advantage. Make sure you provide your feedback as well as share what you learn.


Thanks for the answer!

Looks like using a zoom lens can indeed give good results:
24 shots, using a Canon 70-200 f/4 lens @ 100-140mm.

Probably the best result I've had so far.

"flash can cause problems with the texture mapping due to greater probability for non-uniform lighting."
Sure; actually I was thinking of using an off-camera flash and a white light tent to get even lighting (and a white, uniform background).

A question:
In the Getting Started A-Z video, it mentions shooting every 5-10 degrees and also shooting from multiple angles. But in this post you mention that the number of images should be limited to 40-50. However, shooting every 10 degrees at one angle alone creates 36 images, and shooting every 10 degrees from multiple angles will quickly add up to more than 50 images: 36 shots at two angles = 72 images, and at three angles = 108 images. Should the number of shots per rotation be decreased when shooting from multiple angles?


I have IPM Mobile installed (Galaxy S with 2.1) and the .ipm file in Dropbox (and emailed to myself), and I still can't get the option to open it. Anyone else having problems?


I would email feedback@autocadws.com for suggestions.


108 images would work but would just take more time; I would not exceed that. You may find you do not actually need a shot every 5-10 degrees, depending on the object; just think of getting the same unique point in three or four photos. I would shoot fewer and see how that works, as you can always add more photos to your scene later for more detail.
When we mention 40-50, it is a general rule of thumb for most objects to process quickly and capture much of the subject.

Shaan -

Thanks for the tips. I've been able to create some nice Photofly scenes. When I export to Maya using .OBJ, the imported mesh looks nice, but I'm having trouble cleaning up the geometry and UVs in Maya. Any tips on how to make the Photofly object mirror other objects created in Maya?




I am testing Photofly with images taken of an object inside a virtual environment (Second Life - SL) and I could use some advice as I am having problems getting a good mesh (not well synced and much excess background material on the mesh).

It is relatively simple to take screen-captured images of an object at exact intervals, but the resolution is only about 6000x3000 pixels, yielding images of about 750KB each. The texture and lighting tend to be uniform (but this can be changed).

Will increasing the resolution of the image help or changing the colour of the background? Will giving the background a texture help even if the object has little?

Since I can control the background texture and object texture, I would like to know if there is anything specifically I can do to improve mesh quality taken from captured images within a virtual environment.


I am not a Maya user, sorry.

For a better model than the one in the image you emailed to me, simply select the medium mesh option instead of draft in the upper left of the Photo Scene Editor. Each project is dependent on the photos taken, as mentioned in the tips and video tutorials.
