@martinsiegrist – indeed, if I were dealing with an existing object, photogrammetry would be an attractive method. But all I have is photos from the 1920–1936 period, and alas the subject is long gone. So photo-matching it is.
@davidcockey – so I tried the Picture command (Surface > Plane > Picture), and I see that what it does is set up a surface within 3D space with the image rendered onto it. If I then orbit or rotate the perspective view, the picture is treated like any other object and projected in perspective, so its own perspective changes along with the model’s. With Wallpaper that does not happen, and that’s a good thing, because it makes it possible – in theory – to orient the model to coincide exactly with the image. Various YouTube videos demonstrate the idea, for example “[VRay for Rhino] Photo Perspective Match”, https://www.youtube.com/watch?v=451rQR_FRPw . Admittedly, those videos are all rather long-winded and each adds its own “special sauce” (e.g., you don’t need V-Ray as in the video I cited, and you don’t need to distort the image into 2D perspective in Photoshop as another video does). But the basic principle is there: import the image as Wallpaper, set up the 3D model in front of it, and then pan, zoom, rotate and play with the focal length until you get a fit.
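For anyone who prefers to script that setup rather than click through the menus, here is a minimal sketch using Rhino’s Python rhinoscriptsyntax module. The file path, lens length and camera/target coordinates are placeholders of my own, not values from my model – the real work is still the manual pan/orbit/lens iteration afterwards.

```python
# Minimal sketch (Rhino Python / rhinoscriptsyntax) of the wallpaper photo-match setup.
# The image path and all numeric values below are placeholders.
import rhinoscriptsyntax as rs

view = rs.CurrentView()  # work in the active perspective view

# Put the reference photo behind the model as viewport wallpaper
rs.WallpaperFilename(view, r"C:\photos\subject_1928.jpg")  # hypothetical path
rs.WallpaperGrayScale(view, False)
rs.WallpaperHidden(view, False)

# Then iterate by hand: pan / orbit / zoom, and nudge the lens length
# until the model's silhouette lines up with the photo.
rs.ViewCameraLens(view, 50.0)                 # try different focal lengths (mm)
rs.ViewCamera(view, (250.0, -400.0, 120.0))   # move the camera eye point
rs.ViewTarget(view, (0.0, 0.0, 50.0))         # aim it at the model
```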
Actually, the best – most sophisticated – version of this approach was developed by Witold Jaworski for Blender. He describes it in his pdf book “Virtual Airplane” Vol. 2, and also in two pdfs he posted on his blog back in 2019.
I have been using his process with some success. In Blender, each camera can be set up with its own background image – the equivalent of Wallpaper in Rhino. The extra sophistication in Jaworski’s method is to couple the camera to a target using a “Track To” constraint, which lets you adjust the camera location and the direction of its view axis independently (moving the target swings the view axis, which has the effect of an x-y shift in the picture plane without moving the camera). Rhino can sort of do that too. He also wrote a script that automatically re-adjusts the camera-to-object distance when the focal length is changed, which saves a lot of manual fiddling – very nice to have, although not essential. The biggest deal is that Blender, as I noted in my earlier post above, has the simple but very necessary ability to adjust the depth and opacity of the photo image (i.e. to place the image in front of the model, and to lower its opacity so the model shows through it). Now we know from this post that Rhino, so far, can’t do that.
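To make that concrete, here is a minimal sketch in Blender’s Python API (bpy) of the three ingredients: the “Track To” constraint, the semi-transparent background image drawn in front of the model, and the idea behind the focal-length compensation. This is not Jaworski’s actual script – the object names, the image path and the numbers are placeholders of mine.

```python
# Minimal sketch (Blender Python / bpy) of the photo-match camera rig described above.
# NOT Jaworski's script; object names, image path and numbers are placeholders.
import bpy

cam = bpy.data.objects["Camera"]
target = bpy.data.objects["Empty"]   # an Empty placed roughly at the model's centre

# 1) Couple the camera to a target with a "Track To" constraint, so moving the
#    camera only changes the viewpoint while the view axis keeps pointing at the target.
con = cam.constraints.new(type='TRACK_TO')
con.target = target
con.track_axis = 'TRACK_NEGATIVE_Z'
con.up_axis = 'UP_Y'

# 2) Camera background image, drawn in front of the model and semi-transparent,
#    so the model shows through the photo while you line things up.
cam.data.show_background_images = True
bg = cam.data.background_images.new()
bg.image = bpy.data.images.load(r"C:\photos\subject_1928.jpg")  # hypothetical path
bg.display_depth = 'FRONT'   # draw the photo in front of the model
bg.alpha = 0.5               # see-through

# 3) The idea behind the focal-length helper: when the lens changes, scale the
#    camera-to-target distance by the same ratio so the subject keeps its apparent size.
def set_lens_keep_framing(new_lens):
    old_lens = cam.data.lens
    offset = cam.location - target.location
    cam.location = target.location + offset * (new_lens / old_lens)
    cam.data.lens = new_lens

set_lens_keep_framing(85.0)  # e.g. switch from a 50 mm to an 85 mm lens
```

The distance scaling in set_lens_keep_framing follows from the pinhole relation – apparent size goes as focal length over distance – so scaling the distance by the same ratio as the lens keeps the subject the same size in frame, which is (as I understand it) what Jaworski’s helper automates.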
So I continue to use Blender instead of Rhino for this step (I more or less have to, given Rhino’s too-basic Wallpaper options), but since my model is in nurbs and Blender is mesh (I prefer nurbs), it means a lot of to-and-fro round trips: Nathan’s Rhino-to-Blender importer one way, and a doppelganger mesh model going back the other, iterating toward the final product. It’s a pity that for now Rhino’s wallpaper can’t be adjusted in those two ways – depth and opacity; otherwise I would adapt Jaworski’s method to Rhino.