We Get Around Network Forum
Tags: Capture, iOS, Lidar, Pro3, Rotator

Matterport: iOS LiDAR scanning and no stitching errors

Wingman (WGAN Fan Club Member, Queensland, Australia):
I got this automatic pan/tilt head delivered today.




It was cheap to buy, and I wanted to test a theory that with a proper head you can produce a great tour even with an iPhone/iPad.
To be honest, I was torn between buying a good phone holder and building a manual rotation system with full 360-degree pan and up/down tilt for complex Matterport scans. However, a good phone holder alone would cost at least half of what I could get this head for, so I went with the head.

As a motorised tilt/pan head, it does not have many advantages for Matterport, because:

1) It is not a full 360-degree pan; it only goes 350-355 degrees. As a result, when you need to match the circle with the dot in the Capture app, you may hit the pan limit and have to start rotating the tripod instead.
2) There is no control over the speed of panning or tilting. It does not look fast, but it is still hard to stop the dot inside the circle; it is always a miss. You can try to match it, but from my limited experience it is better to just slightly rotate the whole head or the tripod. When the tilt angle does not match, you have to tilt the head manually.

So there is no benefit in terms of capturing a scan faster with it. I know these heads are hackable, and you could even try to write a script that pans and tilts automatically through precise angles and finishes one scan (even a complex one) with a single click. But as supplied, it is what it is: without any mods or scripting, you can work faster with a manual nodal head.


3) Unfortunately, the phone holder on it is quite small. I could only fit my iPhone 12 Pro in it without a case, and with some force. Anything wider than an iPhone 12 Pro will not fit at all.

4) The phone lens is still offset from the axis of rotation. Not by much, maybe just 1-2cm, and it could be fixed, but I guess the whole arm or phone holder would need to be rebuilt. Matterport accepts this rotation but says there is an 8-9cm offset from where it should be. I am not sure why they say 8-9cm when it is barely 1-2cm, but they are not even offering to redo them.
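The scripted one-click scan imagined above could start as a simple angle plan. This is only a sketch of the idea: the 3-rows-by-6-stops pattern and the head-control interface are my assumptions, since no programmable API is documented for this head.

```python
# Sketch: the ordered (pan, tilt) stops a scripted head would visit for one
# complex scan -- here assumed to be 3 tilt rows x 6 pan positions, roughly
# mirroring a manual nodal-head workflow. Driving a real head would need a
# hardware-specific driver; this only generates the plan.

def scan_plan(pan_steps=6, tilt_angles=(-30, 0, 30), pan_limit=350):
    """Return the list of (pan, tilt) angles, in degrees, for one scan.

    pan_limit models the head's ~350-355 degree restriction: any stop
    beyond it is dropped, which is exactly where manual rotation of the
    tripod would have to take over.
    """
    step = 360 / pan_steps  # ideal spacing for full horizontal coverage
    plan = []
    for tilt in tilt_angles:
        for i in range(pan_steps):
            pan = i * step
            if pan <= pan_limit:  # the head cannot reach past its limit
                plan.append((pan, tilt))
    return plan

plan = scan_plan()  # 6 stops at 0..300 degrees, for each of 3 tilt rows
```

With six stops the last pan position is 300 degrees, so the 350-degree limit never bites; a finer step (say, 12 stops of 30 degrees) would start losing positions at the end of each row.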


Now about capturing Matterport with the head.

If you have updated your Capture app, the current production version has iPhone LiDAR scanning. I am not sure why Matterport added it. It may assist with scanning in the sun and generate more mesh, but you will not be able to use that scan point as a walk point, because it only captures a single horizontal row; everything above and below will be blurred. So after using it to assist with scanning in sunlight, you will still need to do another scan: either a complex scan with the iOS device, or another camera (Pro2, Z1) to capture a full-height panorama.

If you use iOS LiDAR, it is unpredictable where you can do your next scan. I believe the biggest problem is that Matterport does only a one-row scan for iOS LiDAR, so it barely sees anything above or below; it only sees what is in front of the iOS device. Outdoors this can be a problem, for example in a garden or a big lawn area where there is not much close by. As a result, sometimes you can do your next scan 3-5 metres away and it will align, but most of the time you need to be closer to the previous scan point.

Adding a complex scan to scans done with iOS LiDAR can align, but it screws up the placement of the complex scan relative to those done with iOS LiDAR. I guess the best approach is to use complex scanning and not use LiDAR scanning at all, or to use iOS LiDAR only to assist scanning outdoors in sunlight so you can add Pro2 walk points later.


Now about stitching. It is not really a theory that if you do not rotate the lens around its nodal point you will get stitching errors. So I was not really trying to prove anything; I just wanted to show that it is possible to use iOS devices with Matterport without generating a lot of stitching errors.
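The nodal-point claim can be put in rough numbers: rotating a lens that sits off the rotation axis produces parallax between adjacent frames, and the error grows as subjects get closer. A back-of-envelope estimate, using the ~1-2cm offset described above (the distances are my own example values):

```python
import math

def parallax_deg(offset_m, distance_m):
    """Approximate angular parallax (degrees) between frames when the lens
    is offset_m from the rotation axis and the subject is distance_m away."""
    return math.degrees(math.atan(offset_m / distance_m))

# With a 2 cm offset:
near = parallax_deg(0.02, 0.5)  # subject 0.5 m away: roughly 2.3 degrees
far = parallax_deg(0.02, 5.0)   # subject 5 m away: roughly 0.23 degrees
```

A couple of degrees of parallax on nearby objects is enough for visible seams, while distant walls barely move, which matches the post's observation that the remaining errors are small and mostly unobtrusive.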




The tour starts with points done with the built-in iPhone 12 LiDAR. In fact, all points apart from the last were done the same way.

Only this one https://my.matterport.com/show/?m=sDqSbUbq7Ry&sr=-1.79,1.19&ss=6 was done as a complex scan, which you can guess if you look up and down and see very little blur compared to any point before it. The attached 360 was also done as a complex scan.

As you can see, there is a very small number of stitching errors, and they are not even that noticeable. I would be surprised to see none, because the lens is still offset 1-2cm from the rotation axis.

The strange thing (a good thing, though) is that even the first LiDAR scan point was done with no void present, the black hexagon any Matterport camera would produce for the first scan. I do not know why, since the phone only captures the front view and cannot see below where the tripod is.


Would I use the head with Matterport? Maybe, if I can change where the phone sits (it needs to be moved 1-2cm further from the arm), and only if I can change the rotation speed.
Otherwise, I may just take the phone holder from it and see if I can adapt it for use on the following Neewer head.


Post 1
Wingman (WGAN Fan Club Member, Queensland, Australia):
Silly me for not trying the buttons on the head. You can actually control the speed of panning and tilting.
Post 2
MatterFix (Matterport Camera Repair Service, Gainesville, Florida):
Thanks for all the info! I have to believe that someone is going to come out with an auto head set up specifically for iOS Matterport scanning... eventually, some future variation of this setup may be the next MP camera.
Post 3
JuMP (WGAN 3rd Party Service Member, Beijing):
@Wingman Thank you for the testing.
I sent you the panos from your test showcase.
Maybe you would like to share them here.
They are 8K x 4K.
I can't get 16K x 8K from the iPhone showcase database.
Post 4
DanSmigrod (WGAN Forum Founder & WGAN-TV Podcast Host, Atlanta, Georgia):
@Wingman

Great idea.

I have to imagine that Matterport is working on a smartphone rotator (like GeoCV did).

Could this solution be the Matterport Pro3 Camera?

Best,
Post 5
bethereeu:
It seems like LiDAR has no effect on 3D quality, so there is no real difference between the iPhone 11 and 12. This scan was created almost a year ago: https://my.matterport.com/show/?m=N4w6441WTmp

Post 6
Tosolini (Tosolini Productions, Bellevue, Washington):
@Wingman thanks for the R&D on this new hardware and the extensive write-up
Post 7
Wingman (WGAN Fan Club Member, Queensland, Australia):
Quote:
Originally Posted by DanSmigrod
@Wingman

I have to imagine that Matterport is working on a smartphone rotator (like GeoCV did).


I do not think they will ever do it. If they made one and offered it for iOS scanning, they could lose a lot of sales of their own cameras. However, photographers or third parties could make one, because it may be as simple as coding some existing rotators, or simply building one based on an Arduino, for example.

JuMP got the 360 photos extracted, and they are slightly more than 32MP for LiDAR-mode scans. I have just asked them to extract the last scan point (https://my.matterport.com/show/?m=sDqSbUbq7Ry&sr=.34,1.32&ss=6) as a 360 photo to see if its resolution is better. It was done as a complex 3D scan on an iPhone 12 Pro. Since that is three-row scanning, it may give much better resolution than anything else related to iOS scanning.
Post 8
Wingman (WGAN Fan Club Member, Queensland, Australia):
After checking all the 360s supplied by JuMP, they are all about 32-33MP, so they are nowhere near the quality of a Pro2.
I need to redo it with an iPad, because by the look of it, when I did a tour around my pool it looked as good as a Pro2. Unfortunately, I deleted that tour from my account and will need to process it again.

Technically, both the iPad Pro 2020 and the iPhone 12 Pro have a 12MP imaging sensor on the back, so it should make no difference which one is used for scanning. However, capturing so many pictures (I believe there are 18) over 3 rows at 12MP each just cannot end up capped at 32MP resolution; it should be much more than that. If I add the poor quality coming from the iPhone 12 Pro in the InsideMaps system, I can guess there may be something wrong with the iPhone 12 Pro's default camera settings, the way capturing is handled through the API, or whatever Matterport or InsideMaps use to control the iPhone 12 Pro camera and transfer images into their systems.
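The gap described above can be rough-checked with simple arithmetic. The 18-frame count comes from the post; the overlap fraction is my guess, not a Matterport figure:

```python
# Rough sanity check of the ~32MP pano figure: 18 frames of 12MP each,
# even after discounting a generous assumed 50% overlap between
# neighbouring frames, leave far more unique pixels than 32MP.

frames = 18          # frames per complex scan, per the post
mp_per_frame = 12    # rear sensor resolution, megapixels
overlap = 0.5        # ASSUMED overlap fraction between adjacent frames

raw_mp = frames * mp_per_frame      # total megapixels captured: 216
unique_mp = raw_mp * (1 - overlap)  # unique megapixels after overlap: 108
```

Even with half of every frame thrown away to overlap, roughly 108MP of unique coverage remains, so a 32-33MP output suggests heavy downscaling somewhere in the capture or upload pipeline rather than a sensor limit.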
Post 9
This topic is archived.